Our latest evidence review on the economic impact of cultural and sport projects might make for uncomfortable reading for some local decision makers. We looked at these programmes’ effects on wages and employment and found that effects tended to be small – and most likely zero. How might we reconcile this with reports that tend to suggest large benefits for the local economy?
The first thing to note is that we focused on evaluations that measure the effects on employment and wages after the event is held or the facility is built. Many other reports approach this differently – often starting with additional visitor numbers (i.e. people who visit the area specifically because of the project) and then using various assumptions to get from there to the employment and wage effects.
Figuring out additional visitor numbers directly linked to (say) an event is tricky, as ‘mega-events’ may simultaneously put people off as well as attracting visits. Thus, some very large events can be associated with a fall in overall visitors – as appears to have happened with the 2012 Olympics.
Assuming visitor numbers are up year-on-year, we need to decide how much of this increase can be attributed to the project. Most reports rely on surveys of visitors. However, it is notoriously difficult to jump from these statements about reasons for visiting to a viable estimate of the number of additional visitors.
This is compounded by the fact that we don’t have any evaluations that assess whether projects caused increased visitor numbers. As we explain in our evidence review, establishing the causal impact of a project on visitor numbers requires the construction of a valid counterfactual – i.e. what would have happened to an area if it had not hosted the event or built the facility. A standard approach is to create a counterfactual group of similar places not undertaking this kind of project. Changes in outcomes can then be compared between the ‘treatment group’ (locations affected by the event/facility) and the ‘control group’ (locations not affected). Unfortunately, most evaluations – particularly for visitor numbers – simply don’t provide these kinds of comparisons.
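The treatment/control comparison described above is essentially a difference-in-differences calculation. A minimal sketch, using entirely hypothetical visitor figures (not data from any real evaluation), shows why the comparison group matters: the naive before/after change in the host area can overstate the effect when similar places were improving anyway.

```python
# Difference-in-differences sketch with hypothetical numbers.
# "treated" = area hosting the event/facility; "control" = similar areas that did not.
treated_before, treated_after = 100.0, 112.0   # e.g. visitors (thousands)
control_before, control_after = 100.0, 108.0

# Raw before/after change in the host area alone...
naive_change = treated_after - treated_before

# ...versus the change net of what comparable places experienced anyway.
did_estimate = (treated_after - treated_before) - (control_after - control_before)

print(naive_change)   # 12.0
print(did_estimate)   # 4.0
```

On these made-up figures, two-thirds of the apparent increase would have happened without the project – which is exactly the kind of correction most visitor-number claims never make.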
Once we have an estimate of additional visitor numbers, then we need to get from this figure to an estimate of the likely impact they will have on the local economy. Again, most studies try to achieve this through surveys – e.g. by asking people how much they are going to spend. They then apply a technique called input-output modelling to figure out the impact on employment and wages as that spending flows throughout the local economy.
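The survey-plus-multiplier chain can be sketched in a few lines. All of the parameters below are hypothetical placeholders – the point is that the final jobs figure is just the product of a series of assumptions, each of which is uncertain.

```python
# Stylised sketch of the visitors -> spend -> jobs chain (all numbers hypothetical).
additional_visitors = 1_000_000   # from visitor surveys
spend_per_visitor = 50.0          # £, from expenditure surveys
output_multiplier = 1.4           # assumed input-output multiplier
jobs_per_million = 15             # assumed jobs supported per £1m of output

direct_spend = additional_visitors * spend_per_visitor        # £50m
total_output = direct_spend * output_multiplier               # £70m
jobs_supported = total_output / 1_000_000 * jobs_per_million  # 1,050 jobs

print(round(jobs_supported))  # 1050
```

Because the estimate is multiplicative, errors compound: halving the additionality assumption, say, halves the headline jobs figure.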
Each step in this chain involves a lot of uncertainty. How many of the visitors are additional? How much do they spend? How does that spending get translated into local employment?
A good example of this approach in action can be found in one of the reports looking at the impact of Liverpool’s year as City of Culture.
That report suggests that of the 27.5m visitors to Liverpool in 2008, 14.7m were ‘influenced’ by the City of Culture project. After some manipulation to remove ‘usual’ visits the report suggests a little under 10m additional visits, which is estimated to have generated about £800m of expenditure which in turn supported 13,000 jobs in the Liverpool City Region.
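The headline figures from the report imply some useful back-of-the-envelope ratios. A rough reconstruction, using the numbers quoted above (the "a little under 10m" figure is approximated as 10m, and the divisions are our simplification, not the report's method):

```python
# Figures as quoted in the text; ratios are our own rough arithmetic.
total_visits = 27.5e6        # all visits to Liverpool in 2008
influenced_visits = 14.7e6   # visits 'influenced' by City of Culture
additional_visits = 10e6     # approximation of "a little under 10m"
expenditure = 800e6          # £800m of additional expenditure
jobs = 13_000                # jobs supported in the Liverpool City Region

spend_per_additional_visit = expenditure / additional_visits  # ~£80 per visit
expenditure_per_job = expenditure / jobs                      # ~£61,500 per job

print(round(spend_per_additional_visit))  # 80
print(round(expenditure_per_job))         # 61538
```

Whether £80 of spending per additional visit, or one job per £61,500 of spending, is plausible depends entirely on the survey and multiplier assumptions discussed above.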
That seems like a fairly substantial economic impact. So how do we reconcile that with our general finding that these kinds of projects tend to generate little or no effect?
The first possibility is that Liverpool City of Culture did generate additional jobs. After all, our evidence review found that around a quarter of the (mostly sport related) projects we considered did generate a small positive impact on employment. Perhaps this was one of those 25% of successful projects.
The second possibility is that the City of Culture study overstates the impact because it doesn’t provide a proper impact evaluation based on a comparison to a control group. For example, you would expect most of the job increases to show up in the retail or tourism sectors. Yet neither sector shows particularly large jumps in 2008, certainly not when compared to other core cities (see figure 12, p.28 of this report).
In fact, if you look at aggregate statistics for the Liverpool City Region, total employment was 635,000 in 2007; 621,000 in 2008; 619,000 in 2009; and 628,000 in 2010. It’s certainly hard to see the 13,000 additional jobs in the 2008 figures. Of course, the situation might have been even worse without those additional jobs. However, the figures for Liverpool City Region look even less impressive if you compare them to the employment trend in the North West as a whole.
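The aggregate figures make the point starkly. Using the employment totals quoted above (the calculation is ours, not from any report):

```python
# Liverpool City Region total employment, as quoted in the text.
employment = {2007: 635_000, 2008: 621_000, 2009: 619_000, 2010: 628_000}

# Year-on-year change over the City of Culture year.
change_2007_to_2008 = employment[2008] - employment[2007]

print(change_2007_to_2008)  # -14000
```

Total employment fell by 14,000 in the City of Culture year – which is why, without a comparison group, the claim of 13,000 additional jobs is so hard to assess.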
All of this is just a complicated way of pointing out that, in the absence of a comparison group, it’s hard to know what to make of the claim that the City of Culture project supported an additional 13,000 jobs. And this is not to pick on Liverpool: we could run a similar exercise for the 2012 Olympics, using some of the many studies claiming a substantial economic gain from the Games.
Sporting and cultural projects certainly have intrinsic value – and given the evident well-being benefits, we shouldn’t need to rely on economic rationales for spending public money on them. But claims about their instrumental value in boosting the local economy rest on a shaky evidence base. We can do more to improve the quality of that evidence base – particularly by looking directly at the employment effects of projects against reasonable comparison groups. In the absence of more careful work, the best bet on the available evidence is that employment and wage effects are likely to be small – and most likely zero.