What works in getting evidence used in policy, decision-making and practice?

Two experts from the Early Intervention Foundation summarise some of the varied approaches taken by this 'What Works' centre to improve evidence use.

18.10.2019

One of EIF’s core functions is to see that evidence on early intervention is actually being used in policy, decision-making and practice. We are committed to making ongoing improvements in how we work once the evidence moves ‘beyond publication’.

A key part of being a What Works centre is working out how best to communicate evidence and how to support different audiences to use it. But as a recent article, 'What works now? Continuity and change in the use of evidence to improve public policy and service delivery', points out, 20 years on from the beginnings of the 'What Works' movement in the UK, we still don’t know a great deal about what works in getting evidence used.

We don’t know enough about what works in getting evidence used

While recent years have seen growing experimentation by What Works centres and many other research organisations with different ways of building relationships between evidence producers and users, the reality, as the article points out, is that "many evidence-use initiatives are not well documented, let alone evaluated". With a few notable exceptions, such as the EEF Octopus trials, we don’t tend to evaluate different research-use strategies. The result is that we are all working with a fairly limited evidence base on the use of evidence itself. This is slowly starting to change, and publications such as 'The Science of Using Science' and 'What Works Now? Evidence-Informed Policy and Practice' are an important step forward.

Since our foundation as a What Works centre in 2013, EIF has been experimenting with different ways of getting evidence used. We’d be the first to admit that we’ve been doing this largely without reference to the literature on evidence use that does exist. We are keen to bridge the divide between practitioner-facing organisations like ours, which are experimenting with ways of doing this but often not testing what they do, and the academics who are seeking to grow the discipline of evidence use or "knowledge mobilisation" (to give it its more academic label). We want to be part of a joined-up movement to better understand the barriers to evidence use within the system we’re operating in, and to assess the diverse set of methods designed to bring down these barriers and promote evidence use.

EIF’s approach to using evidence

At EIF, our mission is to ensure that effective early intervention is available and is used to improve the lives of children and young people at risk of poor outcomes. We do this by making the case for early intervention, by generating and synthesising evidence, and by getting evidence used to change policy and practice, both nationally and locally.

We have always gone beyond simply publishing evidence, and we’ve been experimenting with different mechanisms to communicate it and support its use since 2013. Our new strategy (2018–2023) puts even more emphasis on trying out different ways to proactively engage our audiences with the evidence we produce and to support them to think about the implications for local service delivery. In this blog, we set out some examples of the methods we have tried.

Bringing evidence to life and making it accessible

People sometimes say "the evidence speaks for itself", but in our experience it really doesn’t. The evidence is often inconclusive, and the ‘so what’ implications of new research can be hard to pin down. We therefore spend a lot of time at EIF (as others do) ‘translating’ the evidence: thinking about what findings actually mean for local commissioning, service development and practice, and then turning this into tools and guides that are directly relevant to specific local delivery contexts. This isn’t always possible: sometimes there are too many questions that the evidence can’t answer (for now). Current EIF examples of these resources include planning tools, maturity matrices, commissioner guides, briefings, webinars and – most ambitiously – dedicated online hubs that bring these resources together, where local practitioners can find a range of information on a particular area of early intervention, such as early years system transformation or reducing parental conflict.

Taking the evidence out to our audiences

It’s obvious that if we want people to use the evidence then we need to put it in their hands. Taking the evidence out into the world is crucial, whether through events, seminars, workshops or other methods. We’ve been trying to understand what can be achieved through these ‘single hit’ methods. We know that expecting everyone to change their practice after a one-off event is unrealistic, but it is still important to understand what we can hope to achieve this way.

We have tested this potential in the past. In 2016 we published a major evidence review of 75 interventions aimed at supporting positive parent–child interactions in the early years. To support the report, we then delivered a series of events of varying sizes, which engaged three-quarters of local authorities.

We were able to gauge the effectiveness of these events through a series of follow-up surveys (see our 2017/18 annual report for more detail on this work). We learned that four to six months after attending our conference event, 60% of participants said they had reflected on their own practice a lot, 15% had changed their behaviour a lot, and 55% had changed their behaviour a little. We were particularly interested in whether an intention to use the evidence, expressed immediately after a learning event, actually translated into action in the longer term. While recognising the limitations of self-report data based on recall, we were encouraged to see that some participants said this had happened. Four to six months after our evidence seminar events, 65% of respondents reported that they had used EIF evidence in the past six months, 93% had discussed best practice with colleagues, and 90% had reflected on their own practice.

Partnering with sector and workforce bodies

Sometimes the best way to be heard is to communicate through and with the organisations our audiences are already listening to. For example, in our efforts to reach councillors who are lead members for children, we have worked with the Local Government Association (LGA) to build our evidence into their professional learning and development programme and to produce resources for councillors, such as an early help resource pack.

Working with local places and running networks

We work directly with local places and regularly facilitate networks, bringing together people in similar roles to discuss applying evidence to their work. Our most ambitious example to date is the Early Years Transformation Academy, which brings together teams of local system leaders responsible for maternity and early years services from five areas (Dudley; Barking and Dagenham; Norfolk; Westminster, Kensington and Chelsea; and Sandwell) to use the evidence to explore new approaches, consolidate local planning, access independent support and challenge, and invest in leadership development. We are planning to evaluate the impact of the academy for participants, and to test the feasibility of this model as a vehicle for driving evidence use. In the past, we’ve run networks for police leaders involved in early intervention and for reducing parental conflict ‘pioneers’.

Exploring the potential of behavioural science

We are beginning to experiment with using behavioural science to help us think about ways to get our evidence used. This means starting work to drive evidence use by exploring the current attitudes and behaviours of specific audiences, and using what we find to guide follow-on work. For example, we recently published guidance for primary schools on evidence-based social and emotional learning (SEL), in partnership with the Education Endowment Foundation, and have been collaborating on an accompanying project to explore the barriers that schools might face in adopting our recommendations. Qualitative research, using the COM-B behaviour change model developed by Professor Susan Michie, has helped us to appraise how schools view SEL activity and the barriers they perceive to using evidence, and to design a knowledge mobilisation plan that actively responds to these findings.

This work has been supported by the Economic and Social Research Council (ESRC), who are interested in our initial reflections on the value of this kind of process to the What Works network. This work is in the early stages for us, and current funding will not permit any serious evaluation of the methods we use. But already we see the methodology as offering real potential. It has helped us to avoid the trap of making assumptions about what stops schools engaging with the evidence on SEL, and enabled us to tailor our initial communications messages to respond directly to some of the barriers identified.

For example, we could easily have assumed that there was a motivation barrier to be overcome: that we needed to convince schools that SEL was important. However, our qualitative work gave us a clear understanding of the nuances here: that primary schools already see SEL as very important, and, crucially, that they see it as something they are already doing, and doing well. In reality, the message we need to land is about the importance of high-quality implementation of SEL, and the need for schools to engage with the evidence. As a result, we are now focusing on outputs which set out what ‘delivering SEL well’ looks like, rather than those which explain ‘why SEL matters’.

Where next for learning what works in evidence use?

Over the years at EIF, we have developed a variety of methods for encouraging the use of early intervention evidence, but these methods have largely been developed ‘in house’, without drawing on or contributing to the literature on what works in evidence use. It can often be difficult to find the resources to monitor whether the approaches we are using are working to change behaviour and get evidence used. This needs to change. We are committed to being more considered in our use of evidence-use methods, and to doing more to learn what works best in driving evidence use among our particular audiences.

We are serious about getting evidence used. This means that a major focus for us over the next few years will be working with other What Works centres and with academics interested in evidence use, to make sure that we contribute as both the theory and the practice of evidence use take the next leap forward.

Jo Casebourne and Donna Molloy are, respectively, Chief Executive and Director of Policy and Practice at the Early Intervention Foundation. This blog was originally posted on the EIF website.

