
Do we know or care about what works for local growth?


Last Wednesday Henry Overman spoke in front of a home crowd at LSE on this topic, alongside Alexandra Jones of the Centre for Cities and David Halpern of the Behavioural Insights Team, in a discussion chaired by Lynne Miles of Arup. It was encouraging to have such a large and engaged audience listening to our findings and some of their implications.

For this event Henry went back to the first principles that underpin our work: the drivers of economic growth in the UK favour some places and people over others. The government has spent a great deal of money attempting to counteract this and improve prosperity for the places and people that are falling behind. Yet even over 1998 to 2008, years of strong overall economic growth and relatively high government spending, the wage gap between, for example, cities in the north and the south east persisted.

We do not know, of course, whether the gap would be worse without government policies for economic growth. What we do know is that in some areas the conventional wisdom about economic activity is not supported by the evidence. Sport and cultural spending, for example, although important and desirable for many reasons, cannot be expected to generate lasting economic benefits for places with stagnant underlying economies. And the benefits to firms from access-to-finance programmes do not appear to translate into growth in their local economies. For each policy area we have reviewed, we have a list of lessons that policymakers can learn to improve the odds of making the impact they want to see. These lessons may not always be as clear cut as 'this works, and that does not', but they do reinforce the value of even a relatively small evidence base and the importance of more experimentation and testing to figure out what works.

David Halpern's answer to the title question (do we know or care what works?) was 'not enough' to both. But he was insistent that we can find out. Medicine and education provide excellent role models for developing a robust approach to evaluating practice, which can lead to better solutions. Experimenting more regularly during implementation can at the very least provide some insight into which approaches offer the best value for money. He cited a recent Behavioural Insights Team study which found that sending text messages to students at FE colleges, encouraging them to study, plan their journey to college, or get back into the swing after a break, had a significant impact on attendance. Students receiving the texts were 7% more likely to attend class than those who did not, and drop-out rates among this group were a stunning 30% lower. An intervention that is simple and inexpensive to implement can offer real improvements.

Alexandra Jones offered two important practical assessments. Firstly, 'fairness' is a tenet of political dialogue and a fundamental goal of government. But fairness at its most basic level, such as spending economic development money evenly across places, may not always deliver the results that people really care about. The evidence is clear that in some cases focusing on growing successful places will generate much stronger growth overall than trying to generate growth in places that are lagging. For those lagging places, building the skills of the people who live there is a better bet for reducing unemployment.

Her second point was that the nuances of economic policy are challenging to communicate in a political environment. Where the evidence does not resonate with people's sense of how things work, we may need to find new ways of getting our messages across. This has been an important goal for us, and one that we will continue to grapple with now that our evidence reviews are complete.

Obviously, we do care what works – the clue is in the name. Our challenge now is to continue to expand and improve the evidence base, find better ways to communicate what the evidence says, and help policymakers to design programmes that make the most of what we know works.