How we used routine, administrative data to evaluate the outcomes of a key government programme

Two government analysts reflect on their experiences using existing data to evaluate policies – in this case, the Troubled Families Programme.

09.09.2019

In the recent book on evidence use, 'What Works Now', the authors highlight the potential contribution of routine administrative data to evidence and policy. This blog provides an example of the successful use of linked administrative data to evaluate the Troubled Families Programme. It also highlights some of the challenges of linking and using administrative data.

What is the Troubled Families Programme?

The Troubled Families Programme is one of the biggest social policy programmes in government. It is a £920m programme, run across England in 149 upper-tier local authority areas, providing services to 400k families with multiple, high-cost problems including worklessness, domestic abuse, mental and physical ill health, school truancy, anti-social behaviour and offending.

The challenge of robust impact evaluation

Impact evaluations, like the national evaluation of the Troubled Families Programme, aren’t common. A National Audit Office (NAO) study from five years ago reviewed 6,000 analytical reports by government and found that only 305 (5 per cent) were impact evaluations. Of these studies, only 2 per cent were assessed as being of good quality, in that they included a counterfactual (or comparison group), and only 1 per cent included a cost-benefit assessment.

How did we evaluate the programme?

MHCLG’s Troubled Families Analysts were given a project brief to include all local authority areas running the programme and to carry out a robust impact evaluation (with a counterfactual) and cost benefit analysis – all within a limited budget.

The tough brief for an evaluation of this scale was a catalyst for innovation. Traditional data collection methods, such as surveys, were out of the question, as it would have been prohibitively expensive to sample and collect data from every area. We soon realised that we needed a completely different approach.

We decided to use linked administrative data to conduct an impact analysis of the programme. The basic plan was to collect personal data, such as names, dates of birth and addresses of families either on the programme or awaiting a place (the latter formed a comparison group for the evaluation), and then match this information to various separate administrative datasets, including the Police National Computer, the National Pupil Database, HM Revenue and Customs tax records and the DWP benefits system. We also wanted full historical data on each individual.
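
The linkage step described above can be sketched in miniature. The snippet below is a toy illustration of deterministic record linkage, matching people to an administrative dataset on exact name, date of birth and postcode; all names, fields and figures are hypothetical, and a real exercise at this scale would also need fuzzy matching to handle typos, name changes and house moves.

```python
# Toy deterministic record linkage: match families to an administrative
# dataset on exact (name, date of birth, postcode). All data is made up.

families = [
    {"name": "Alex Smith", "dob": "1990-01-15", "postcode": "AB1 2CD"},
    {"name": "Sam Jones", "dob": "1985-06-02", "postcode": "EF3 4GH"},
]

# Administrative records keyed on the linkage identifiers.
admin_records = {
    ("Alex Smith", "1990-01-15", "AB1 2CD"): {"benefit_claims": 3},
    ("Sam Jones", "1985-06-02", "EF3 4GH"): {"benefit_claims": 0},
    ("Pat Brown", "1979-11-30", "IJ5 6KL"): {"benefit_claims": 1},
}

linked = []
for person in families:
    key = (person["name"], person["dob"], person["postcode"])
    record = admin_records.get(key)  # None if no exact match found
    linked.append({**person, **(record or {"benefit_claims": None})})

for row in linked:
    print(row["name"], row["benefit_claims"])
```

In practice, exact matching like this misses genuine links whenever any identifier is recorded inconsistently, which is one reason linkage of this kind takes sustained effort across many data suppliers.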

We then used propensity score matching methods to compare outcomes of the intervention and comparison groups. Other policy programmes had been evaluated in a similar way in the past, but never to the same scale, or in combination with so many different government databases or other sources of information.
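
The matching step can be illustrated with a minimal sketch of nearest-neighbour propensity score matching. The scores and outcomes below are invented for illustration; in practice, the scores would come from a model (typically a logistic regression) of programme participation on observed family characteristics, and the matching would be far more sophisticated than this.

```python
# Minimal nearest-neighbour propensity score matching, with made-up data.
# Each tuple is (propensity score, binary outcome).

treated = [      # hypothetical families on the programme
    (0.80, 1), (0.60, 0), (0.40, 1),
]
comparison = [   # hypothetical families awaiting a place
    (0.78, 0), (0.55, 0), (0.45, 1), (0.20, 0),
]

# Match each treated unit to the comparison unit with the closest score.
matched_pairs = []
for score, outcome in treated:
    nearest = min(comparison, key=lambda c: abs(c[0] - score))
    matched_pairs.append((outcome, nearest[1]))

# Average treatment effect on the treated: mean outcome difference
# across matched pairs.
att = sum(t - c for t, c in matched_pairs) / len(matched_pairs)
print(f"ATT estimate: {att:.2f}")
```

The intuition is that each programme family is compared against the waiting-list family that looked most similar before the programme started, so differences in later outcomes are more plausibly attributable to the programme itself.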

What were the challenges?

It took two years to navigate the data protection legislation and identify and agree the legal gateways to share and use the data. Indeed, at one point the team was one of the biggest users of legal services in MHCLG, and had more meetings with lawyers than analysts.

We learned along the way that innovation brings risk which can make people (quite rightly) wary. This meant keeping key stakeholders (such as local authorities) updated, outlining the benefits and providing reassurance on the risks.

Was it worth it?

Eventually this paid off, and by the end of the project we had one of the biggest data sharing exercises in Whitehall: 149 areas providing data on a regular basis, and linked data on 820k individuals in 373k families.

We have processed several terabytes of data on the ONS supercomputer and assembled nearly 5k variables on every individual in the dataset, with rigorous security and data processing controls to ensure that the data is protected. We were always conscious that one security breach would put an end to this new way of using government data.

We now have a robust evaluation of outcomes. The national impact evaluation has demonstrated the success of the programme, showing that, after two years:

  1. The programme reduced the proportion of children going into care by a third.
  2. Juvenile convictions were down by 15 per cent, with juvenile custody down by 38 per cent and adult custody down by a quarter.
  3. The number of working-age Jobseekers' Allowance claimants appears to have fallen by 11 per cent.
  4. The cost-benefit analysis found that for every £1 invested by central government in the programme, there is an economic and social (or public value) benefit of £2.28 and a fiscal benefit of £1.51.
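
The ratios in the final finding are benefit-cost ratios. The snippet below is a toy illustration of how such a ratio is computed, using entirely made-up cost and benefit totals; the evaluation's actual £2.28 and £1.51 figures come from a far more detailed model of costs and monetised outcomes.

```python
# Toy benefit-cost ratio calculation. All figures are hypothetical;
# a BCR is simply total benefits divided by total costs.

central_govt_cost = 100_000_000     # hypothetical £ invested
fiscal_benefit = 151_000_000        # hypothetical savings to the exchequer
public_value_benefit = 228_000_000  # hypothetical wider economic/social benefit

fiscal_bcr = fiscal_benefit / central_govt_cost
public_value_bcr = public_value_benefit / central_govt_cost

print(f"Fiscal BCR: £{fiscal_bcr:.2f} per £1 invested")
print(f"Public value BCR: £{public_value_bcr:.2f} per £1 invested")
```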

These are impressive rates of return for a social policy programme and something to celebrate – findings that would not have been uncovered without a proper impact evaluation in place.

Ricky Taylor and Lan-Ho Man are Senior Analysts at the Ministry of Housing, Communities and Local Government

Click here to read more about the evaluation and findings.
