If it’s not in the budget it won’t happen

Why evidence and policy are entangled inside government departments

12.09.2025

Over the years I have read many papers whose authors ask why policy officials haven’t adopted a recommendation that would improve their use of evidence. Most report that officials cite ‘budget constraints’, and the analysis ends there. But what lies behind that simple phrase?

The evidence-informed policymaking (EIPM) literature is paying increasing attention to insider perspectives on how evidence is used inside government departments (see for example Lorenzo Marvulli’s PhD thesis, Jo Maybin’s monograph, and the Twende Mbele partnership). However, little work has been done on how the organisational aspects of life inside a department shape how civil servants build and manage a department’s evidence base and use the evidence they have. That’s what interested me when I went to work in the UK’s Department for Environment, Food and Rural Affairs (Defra) in the early 2000s, what has kept me in the field ever since, and what formed the basis of my recent PhD.

The ‘evidence budget’ is a constantly negotiated process

Kathryn Oliver and colleagues have shown how complex the research-policy system is, and how ‘policy’ is supported by a range of other functions. In her useful diagram, she includes ‘funding’ – how much a policy is going to cost to implement.

While departments are concerned with the cost of ‘the policy’, they also operate as organisations with internal costs; the policymaking process itself runs on a budget which is allocated to each department by the national Treasury or Ministry of Finance. There’s the cost of running the department, of acquiring the evidence the department needs, and of staffing policy teams with sufficient capacity to handle the evidence acquired. If a department doesn’t have people who understand what evidence is needed to answer policy questions, who can interpret the evidence when it arrives on their desks, and who can feed it effectively into the policy process, then you can’t be sure that the evidence will be well used. All of this, of course, needs a budget.

Inevitably, within each department the total cost of all requests for evidence will be greater than the overall budget the department has been allocated. And if policy priorities change suddenly, a department may need to hire more specialists in a particular area (for example, Brexit raised the demand for a range of specialist skills to handle border and trade issues). As a department has a limited staffing budget, what should it do?

Managing the budget is all about negotiating priorities. My research found that in Defra the ‘evidence budget’ was not a fixed upper limit but a constantly negotiated process. Senior officials had to share resources with some colleagues and bid against others, working out where they could use some of their budget to access evidence gathered by other organisations (such as from non-departmental public bodies, or academia). These negotiations were constant. For example, an externally contracted project to produce evidence for one team might be delayed by 3 months, which would free up resources that could be used to plug an immediate gap in another team’s budget because their policy area had become a hotter topic. In Defra, the Chief Scientist’s Office, which oversaw the evidence budget, monitored each team’s spending profile every month, working out what current priorities for evidence were, how to balance them with new needs for evidence, and how to extract maximum value from the limited budget.

Evidence, policy and the budget

In fact, it’s important to understand how Defra’s approach to using evidence was institutionalised through the complex, contentious process of managing the evidence budget. Defra’s evidence specialists (natural and social scientists who, as Ben Hepworth notes, have very different roles from Defra’s ‘policymakers’) developed a range of knowledge brokering approaches to work out what evidence would be needed when and how it could be best sourced. They used this brokering to create narratives that supported their annual bids for the budget they would need to gather and use evidence to support their policy work.

So, how evidence was brokered inside Defra was framed by the budget they had to work with – by the strategic narratives that specialists developed to argue for evidence in one area over another, or for one type of evidence over another. I describe this relationship as one of ‘collaborative entanglement’; it’s a relationship that helps us get to the heart of the question that permeates much of the literature on evidence-informed policymaking: who should lead the evidence-to-policy process?

“Who leads?” The issue of collaborative entanglement

Collaborative entanglement helps us understand that the answer to this question is not either/or. The question of “who leads” depends in part on who is accountable for ensuring the evidence budget is well spent: policymakers or evidence specialists. Over the course of a decade, Defra developed a complex, carefully crafted middle ground in which accountability for both policy quality and the quality of the evidence base was framed during budget negotiations. Nobody I interviewed thought the process provided the optimum allocation for their purposes, but overall this way of managing the evidence budget made both sides accountable and was a satisficing attempt to deliver ‘good enough’ outcomes for both groups.

In any policymaking department, discussions about what evidence is needed will be intertwined with negotiations about who could provide it most cost-effectively, what current policy priorities are, what budget is currently available and who should be accountable for spending it wisely. Understanding these negotiations is key to understanding how evidence and policy relate to one another inside departments and how evidence is ultimately used in the policy process.

Learning for research: studying collaborative entanglement

Accessing these discussions is difficult for researchers, precisely because they are complex, contentious and shaped by internal organisational politics. An emerging literature on the politics of priority setting for resource allocation in policy programming may contain useful insights that could be applied to the policymaking space. But there are also real opportunities for researchers who are themselves entangled in the policy process, through secondments or placements, to observe how the budget for evidence is negotiated. Reflecting these observations back to departments as part of a shared learning process could yield greater insights into evidence-policy relationships.

But in the meantime, the next time a policy official says that ‘budget constraints’ are the reason they haven’t adopted a recommendation for improving the use of evidence, I would simply encourage you to dig a little deeper into that phrase.
