How Do Policy-Makers Use Evidence? Findings From A Pilot Study In Australia: International Implications

Research output: Contribution to conference › Poster


Calls for the use of evidence in education policy have become increasingly widespread, both nationally and internationally. Despite this, there have been relatively few in-depth studies of how, when and under what conditions policy-makers use research evidence. This paper shares findings emerging from a one-year study with the Victorian Department of Education and Training (DET) in Australia on its use of evidence in policy development. Drawing on in-depth interviews with policy writers, advisors and researchers who worked on two specific policies, together with documentary analysis, it sheds new light on how evidence is (and is not) used and the complexities involved in the process. It makes links to wider debates around the ‘political uses’ of expert knowledge (Boswell, 2009). Specifically, this paper will:
• provide in-depth examples of how different kinds of evidence were used in the development of two specific recent policies in Victoria (What types of evidence have been used, in what ways and for what purposes?);
• share policy-makers’ reflections on the complexities involved in identifying, evaluating and using evidence within these policies (What challenges are involved in using evidence, and what helps and/or hinders evidence use?);
• explore theoretical connections and points of difference between this work and wider debates around the ‘political uses’ of expert knowledge (Boswell, 2009) and the role of ‘policy narratives’ (Boswell et al., 2011).

A number of theoretical influences informed this project. Firstly, an important conceptual starting point has been a desire to study ‘the use of evidence’ as opposed to ‘the impact of research’.
This picks up on a distinction highlighted by Weiss in the late 1970s, who noted that social scientists tend to ask ‘how can we increase the use of research in decision making?’ Weiss argued that it would be preferable to ask ‘how can we make wiser decisions, and to what extent, in what ways, and under what conditions, can social research help’ (Weiss, 1978, p. 78). Secondly, the work reported in this paper has been influenced by studies that have used in-depth empirical exploration to develop theoretical models of policy enactment and evidence use. The work of Ball, Maguire, and Braun (2012) is informative in its rich conception of policy ‘enactment’ as opposed to ‘implementation’, which recognises ‘the diverse and complex ways in which sets of education policies are made sense of, mediated and struggled over, and sometimes ignored’ (p. 3). Our work seeks to bring a similar breadth of perspective by exploring how policy-makers ‘do’ evidence. Similarly, there are connections with the way Earl and Timperley (2009) have studied evidence use in educational practice – in particular, their focus on the detail of ‘how educators at all levels actually use evidence in their thinking and their decision-making’ and their development of a ‘theoretical model that describes the qualities of productive evidence-informed conversations’ (p. 2). Thirdly, the pilot study takes note of relevant theoretical and empirical insights emerging from the wider evidence-use literature, in particular: (i) typologies of research use that distinguish between ‘instrumental’ (providing answers), ‘conceptual’ (raising questions) and ‘strategic’ (as ammunition) uses of research (Estabrooks, 2001); (ii) work on the political uses of expert knowledge that flags up the importance of knowledge as ‘a source of legitimation’ for policy organisations and as ‘a source of substantiation’ of policy preferences (Boswell, 2009, p. 87); (iii) characterisations of evidence use that emphasise its ‘interactive, iterative and contextual’ nature (Davies, Nutley & Walter, 2008, p. 190); and (iv) the ‘social ecology’ of policy, which highlights ‘how government operates, the decision-making processes and the central players who influence what occurs’ (Tseng, 2012, p. 8).

Method

The pilot study has been developed as a co-funded, collaborative venture between Monash University and the Department of Education and Training. It is designed to build upon the findings of an earlier University of Queensland (UQ) study, which surveyed DET staff as part of a wider investigation into evidence use within Australian public sector agencies (Head, Boreham, & Cherney, 2013). Close to 400 DET staff completed the UQ survey, and their responses showed that ‘research is valued and being used by the majority of staff’ but also that ‘there is compelling evidence of factors that militate against [its] uptake and adoption’ (Head et al., 2013, p. 45). The current pilot study, then, probes the issues, challenges and complexities surrounding evidence use within the Department. More specifically, it explores the following questions:
• What types of research evidence are used?
• Who are the key players in the process?
• How, when and where does evidence use happen and not happen?
• Why does it happen or not happen (drivers, barriers, influencing factors)?
• So what could be done to improve the use of evidence in the future?

These questions are being investigated in the context of three DET policy initiatives, using a combination of documentary analysis and interviews/observation with DET staff. The policy initiatives were selected through in-depth discussion with DET staff in order to: cover different time-scales (i.e. a past initiative, a recent initiative and a current initiative); encompass different aspects of DET’s work (i.e. involve different policy teams); and focus on significant initiatives that are researchable (i.e. involve staff who are open to taking part as interviewees). The focus on past, recent and current policies is deliberate, allowing slightly different ways of generating in-depth accounts of evidence use in policy development. These include: retrospective interviews with former public servants about past policy development, drawing on well-established methods from policy studies such as Walford (1994); backtracking the process of evidence use from recent policy decisions, using approaches developed in studies of research use amongst education practitioners such as Figgis, Zubrick, Butorac, and Alderson (2000); and in-depth case study of current policy development, using well-established techniques from research in schools and other institutional contexts such as those used by Calderhead (1981) and Ball (1985).

Expected Outcomes

Drawing on analysis of the interview data and documentary evidence, this paper will share findings, in-depth examples and emerging ideas around:
• how evidence is used – using a ‘policy narratives’ frame (Boswell et al., 2011), a number of discrete uses of evidence in TVLC and SPF will be discussed. These include uses that seemed to be about working with the narrative (e.g. evidence to flag the case for change, evidence to clarify international practice, and evidence to elaborate the narrative) and uses that seemed to be about challenging the narrative (e.g. evidence to keep things on the agenda, evidence to challenge assumptions, and evidence to challenge proposals).
• why evidence use is complex – three areas of complexity in working with evidence will be examined: accessing a breadth of evidence (e.g. ‘We use a lot that’s easily accessible through a Google search’); evaluating and critiquing evidence (e.g. ‘Not everyone has the skills to interrogate evidence properly’); and making decisions using evidence (e.g. ‘The evidence gets you so far but there’s still got to be judgment about what to do’). All of these arose not only as current challenges but also as areas for potential future capacity building.

Significance

Against the backdrop of calls for the thoughtful promotion of evidence to shape decision-making about educational improvement, as well as all areas of public policy, this paper has the potential to make an important contribution by producing much-needed clarity about how, when and why educational policy-makers use (and don’t use) research evidence. This can inform better-targeted capacity-building strategies and structures for evidence-informed policy-making by illuminating specific practices that can be improved and sustained, both in Victoria and internationally. These insights have wider applicability to international practices of education reform and the roles of policy and educational research.

References

Ball, S. (1985) ‘Participant Observation with Pupils’. In: R.G. Burgess (Ed.) Strategies of Educational Research. Lewes: Falmer Press.
Ball, S., Maguire, M. and Braun, A. (2012) How Schools Do Policy. London: Routledge.
Boswell, C. (2009) The Political Uses of Expert Knowledge. Cambridge: Cambridge University Press.
Boswell, C., Geddes, A. and Scholten, P. (2011) ‘The Role of Narratives in Migration Policy-Making: A Research Framework’, British Journal of Politics and International Relations 13: 1-11.
Calderhead, J. (1981) ‘Stimulated Recall: A Method for Research on Teaching’, British Journal of Educational Psychology 51: 211-217.
Davies, H., Nutley, S. and Walter, I. (2008) ‘Why “knowledge transfer” is misconceived for applied social research’, Journal of Health Services Research & Policy 13(3): 188-190.
DEECD (2012) Towards Victoria as a Learning Community. Melbourne: Department of Education and Early Childhood Development.
DEECD (2013) Professional Practice and Performance for Improved Learning: Overview. Melbourne: Department of Education and Early Childhood Development.
Earl, L.M. and Timperley, H. (2009) ‘Understanding How Evidence and Learning Conversations Work’. In: Earl, L.M. and Timperley, H. (Eds) Professional Learning Conversations. Dordrecht: Springer.
Edwards, M. and Evans, M. (2011) Getting Evidence into Policy-Making. ANZSIG Insight Report. Canberra: ANZSOG.
Estabrooks, C.A. (2001) ‘Research utilization and qualitative research’. In: Morse, J.M., Swanson, J.M. and Kuzel, A.J. (Eds) The Nature of Qualitative Evidence. London: Sage.
Figgis, J., Zubrick, A., Butorac, A. and Alderson, A. (2000) ‘Backtracking practice and policies to research’. In: DETYA, The Impact of Educational Research. Canberra: DETYA.
Finnigan, K.S. and Daly, A.J. (2014) ‘Conclusion: Using Research Evidence from the Schoolhouse Door to Capitol Hill’. In: Finnigan, K.S. and Daly, A.J. (Eds) Using Research Evidence in Education. Dordrecht: Springer.
Head, B., Boreham, P. and Cherney, A. (2013) Public Sector Survey on Evidence-Based Policy: Results for the DEECD. Brisbane: ISSR/UQ.
Nutley, S., Walter, I. and Davies, H.T.O. (2007) Using Evidence: How Research Can Inform Public Services. Bristol: Policy Press.
OECD (2007) Evidence and Policy in Education: Linking Research and Policy. Paris: OECD.
Tseng, V. (2012) ‘The uses of research in policy and practice’, Sharing Child and Youth Development Knowledge 26(2): 1-23.
Walford, G. (Ed.) (1994) Researching the Powerful in Education. London: Routledge.
Weiss, C.H. (1978) ‘Improving the linkage between social research and public policy’. In: L.E. Lynn (Ed.) Knowledge and Policy: The Uncertain Connection. Washington, DC: National Academy Press.
Original language: English
Publication status: Published - 2016
Event: European Conference on Educational Research 2017 - University College UCC, Campus Carlsberg, Copenhagen, Denmark
Duration: 21 Aug 2017 - 25 Aug 2017


Conference: European Conference on Educational Research 2017
Abbreviated title: ECER 2017
City: Copenhagen
