We shouldn’t do policymaking based on vibes. That seems like a position we can all agree on. And yet when we don’t build in the time or structure to examine the evidence, vibes can be all that decision making is left with.
Last week we started our new course, Making use of evidence. We include several real and fictionalised examples where evidence hasn’t been used well:
- In one, the portfolio holder has called for a project to address anti-social behaviour on the high street after receiving three e-mails from the public about recent incidents. Stakeholder engagement provides useful evidence when done well, but the e-mail inbox shouldn’t be the only rationale for a scheme.
- In another, the Director of Growth has asked for examples of successful creative sector incubators. We wouldn’t recommend cherry-picking by looking only for successful examples of the type of project you want to pursue.
- A third scenario involves receiving a report from delivery providers boasting that 50% of participants on a training scheme found employment within 6 months. This, of course, also means that 50% of participants didn’t find employment in that period. And that number may be high or low relative to other providers. In the course, we recommend participants get in the habit of asking “is that a big number?” and looking for evidence that helps answer that question.
Although these are made-up scenarios, readers may find them all too familiar. Short deadlines, full in-trays, and a growing range of responsibilities can mean that teams only include the evidence closest to hand, or request data from an insights team too close to submission dates.
We advocate building in time to find and analyse evidence; our training, Making use of logic models, focuses first and foremost on understanding the need for a project. Many of the conversations I have with places are about how to embed evidence into the project management framework or project initiation systems. And to improve access for those short on time, we and other What Works Centres bring together evidence from robust impact evaluations.
At the end of the session we asked participants what concrete steps they could take in the next month to embed the use of evidence in their day-to-day work. Responses included carving out time to analyse data, creating folders so that final project reports can be found more easily by colleagues, and taking a more structured approach to discussing evidence for projects with directors.
To avoid vibes-based policymaking, we need more of these concrete steps to make sure that evidence is available, accessible, and used.