1 Youth Justice: what hinders moving from evidence to practice
Associate Professor Adrian Cherney, School of Social Science, University of Queensland
2nd Australasian Youth Justice Conference, September
2 Setting the scene: Evidence-based policy and practice (EBPP)
The language of EBPP has infused social work, health care, education, criminal justice and mental health. Has it lived up to its promise? Governments say they are being evidence-based, but is this true?
Greenwood & Welsh (2012), in a study of early adopters of evidence-based delinquency prevention (e.g. Blueprints), concluded:
"There are many challenges, including financial, institutional support for pet projects, and the complexity of the coordination and implementation process. Moreover, the rhetoric surrounding evidence-based practice continues to outweigh the reality, with far too many decision makers and advocates having their own interpretation about what constitutes evidence-based. Also, there is cause for concern about the uptake of legitimate evidence-based programs."
3 So why is the uptake of EBPP so difficult?
This talk steps back and looks at the question broadly. It is informed in part by two studies: an ARC Linkage Project grant and an evidence-based policing project. It does not focus on specific evidence-based programs (i.e. "what works"); the observations are directed at policy-makers and practitioners. This is not a call for more research (or at least not only that).
Data sources:
- Targeted survey of Australian social scientists (n = 693)
- Targeted survey of policy-relevant personnel in 21 agencies (n = 2,084)
- Interviews with a selection of academic respondents (n = 100)
- Interviews with a selection of policy personnel (n = 125)
4 Three problems
1. Fidelity & adaptation
2. Risk aversion & experimentation
3. Organisational imperatives and constraints
There are many other problems: short-termism; pet projects; the one-pager; policy advisors (gatekeepers); staff turnover; organisational memory loss; confirmation bias; interdependence; research units (first to go in cuts); academics; etc.
5 Knowledge dissemination
The assumed model of knowledge transfer – the default linear model underpinning EBPP:
Knowledge creation → Knowledge dissemination → Knowledge adoption
6 Example: Campbell Collaboration (http://www.campbellcollaboration.org/)
Knowledge creation: through systematic reviews (a validation process embedded in the creation process).
Knowledge dissemination: website & reports.
Knowledge adoption: promoted through user groups.
7 Problem 1
[Diagram: adherence to EBPP (fidelity) versus accommodating variations (adaptation), shaped by information, ideology, interests and institutional support.]
8 Fidelity
Community psychology definition: adopting research-based programs as they were designed & tested.
Fidelity is important to the integrity of evidence-based programs (particularly for replication). Deviations from the agreed program theory or content can distort strategy outcomes (e.g. adding or dropping components can lead to program drift).
9 However: adaptation may be necessary
Context: the argument that programs need to be adapted to fit local conditions, allowing for local innovation. There is also a need to accommodate tacit knowledge when implementing evidence-based programs – how does the program fit with knowledge from experience?
BUT… Elliott et al. (2004, Blueprints for Violence Prevention) note that arguments in favour of adaptation assume local contexts are unchangeable.
10 Fidelity/Adaptation Dilemma
[Diagram: EBPP implementation positioned along two axes – adaptation/modification (low to high) and fidelity/adherence (low to high).]
11 Impact of knowledge hierarchies on EBPP implementation
[Diagram: codified knowledge and tacit knowledge both feed into EBPP implementation.]
12 EBPP involves understanding the role of different knowledge types:
- Knowing "what works"
- Knowing "how"
- Knowing "who"
- Knowing "why"
13 Problem 2: But this requires experimentation
All policy is experimental (Banks 2009). Yes – therefore we must embrace trial and error (target, track and test). However, governments & agencies are fearful of trial and error and of failure (risk aversion, fallout, political consequences).
14 Outcome of problem 2
The outcome is a default to pet projects, tinkering and becoming overstretched. Fear of randomised controlled trials – someone will miss out! Unwillingness to admit mistakes (rather than examining the reasons for failure).
15 Problem 3: For experimentation you need an enabling organisational context
Policy-makers & practitioners are more likely to go to a fellow staff member for evidence & research (colleagues are more valued than academics!). They might have access to data and research but prefer to use Google. What drives these behaviours?
17 Impact of organisational factors
When we modelled these behaviours we found that the following survey items:
- "academic research results are considered relevant by my colleagues" (culture)
- "research is important in my professional field" (ethos)
- "difficulty in accessing & understanding full-text academic articles and reports" (accessibility – cognitive & physical)
were the strongest predictors of research use (the quality of the research accessed was not measured) (Cherney et al. 2015). An illustrative sketch of this kind of model follows.
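A minimal, purely hypothetical sketch of the kind of model described above: synthetic Likert-scale survey data and an ordinary least squares regression fitted with statsmodels. The variable names, sample size and coefficients are all invented for illustration and do not reflect the study's actual data, measures or modelling.

```python
# Illustrative only: predicting self-reported research use from survey
# items tapping culture, ethos and accessibility (all data simulated).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500  # hypothetical number of respondents

df = pd.DataFrame({
    "culture": rng.integers(1, 6, n),            # "colleagues consider research relevant" (1-5)
    "ethos": rng.integers(1, 6, n),              # "research is important in my field" (1-5)
    "access_difficulty": rng.integers(1, 6, n),  # difficulty accessing full-text research (1-5)
})

# Simulated outcome: research use rises with culture and ethos,
# falls with access difficulty (assumed effect sizes, plus noise).
df["research_use"] = (
    0.5 * df["culture"] + 0.4 * df["ethos"]
    - 0.3 * df["access_difficulty"] + rng.normal(0, 1, n)
)

X = sm.add_constant(df[["culture", "ethos", "access_difficulty"]])
model = sm.OLS(df["research_use"], X).fit()
print(model.summary())  # coefficient signs mirror the slide's reported predictors
```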
18 Some answers & recommendations (1)
- Focus on quality: quality EBP implementation = quality outcomes.
- Invest in the power of a few (upskill staff in EBPP).
- Target change agents to drive top-down & bottom-up change.
- Face-to-face learning and communication make a difference.
- Action learning – engage practitioners in design and implementation.
19 Some answers & recommendations (2)
- Target evidence-based interventions practitioners will understand and adopt.
- Instigate feedback from practitioners.
- Get a little bit of EBP into your day: RCTs can be small-scale for the purpose of testing practice – nudge experiments (see the sketch below).
- Invest in an EBP unit (not just about data).
- Establish formal processes to build relationships with researchers.
Successful 'adoption' of an evidence-based practice depends on supportive processes operating in the background (Ferlie 2005). These may not guarantee against the influence of politics and organisational contingencies, but they provide a tipping point for EBPP.
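A minimal sketch of what a small-scale practice RCT ("nudge experiment") might look like in code, assuming a hypothetical caseload and a hypothetical letter-template nudge; the sample size, attendance rates and scenario are invented for illustration, not drawn from any actual trial.

```python
# Illustrative small-scale trial: randomly assign cases to a new
# appointment-reminder letter vs. standard practice, then compare
# attendance rates. All data below are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

n_cases = 120                                 # hypothetical caseload
assignment = rng.permutation(n_cases) % 2     # 0 = control, 1 = nudge letter

# Hypothetical observed outcomes (1 = attended appointment),
# with assumed attendance of 65% under the nudge vs. 50% under standard practice.
attended = np.where(
    assignment == 1,
    rng.random(n_cases) < 0.65,
    rng.random(n_cases) < 0.50,
).astype(int)

treat = attended[assignment == 1]
control = attended[assignment == 0]
print(f"nudge: {treat.mean():.2f} attendance, control: {control.mean():.2f}")

# Two-sided test of the difference in attendance rates.
t_stat, p_value = stats.ttest_ind(treat, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

The point of the sketch is that "target, track and test" can run at the scale of a single team's caseload: random assignment plus a simple comparison of outcomes, with no special infrastructure required.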