evaluation methods (Crabbé and Leroy, 2012: 121).
Moreover, realistic evaluation provides a consistent
and coherent framework for evaluation at various
stages of the policy cycle. In addition, it promotes the
maximisation of learning across practice, policy and
organisational boundaries. Overall, realistic
evaluation ‘provides a principled steer away from failed
one-size-fits-all ways of responding to problems’ (Pawson
and Tilley, 2004: 22). However, this evaluation model
has some distinct limitations. To begin with, there is
no general formula that can provide step-by-step
instructions for delivering findings. Realistic
evaluation therefore requires sustained theoretical
understanding and the ability to design research
techniques and to analyse data (Pawson and Tilley,
2004: 22). Finally, this evaluation model does not
allow generalisation of findings, as context is
regarded as one of the most important explanatory
factors (Crabbé and Leroy, 2012: 121). Nevertheless,
the realistic evaluation method can be argued to be
promising for future policy evaluations.
Pawson and Tilley (1997: 78-82) demonstrate the
practical use of realistic evaluation by assessing the
installation of closed-circuit television (CCTV) as a
measure against crime in car parks. By providing lists
of mechanisms and contexts, the authors not only
assess the final outcomes of using CCTV for crime
reduction, but also explain how and under which
conditions the introduction of this particular measure
promotes the decrease of crime in car parks. For
instance, one of the mechanisms outlined is what the
authors refer to as ‘memory jogging’. This mechanism
emphasises the role of CCTV in reminding drivers of
the vulnerability of their cars. Drivers may thereby be
prompted to take greater care to lock their vehicles, to
remove easily stolen items from view or to purchase
additional security devices.
The evaluation of the Aboriginal Parental Engagement
Program (APEP) is another empirical example
demonstrating the beneficial use of the realistic
approach to policy evaluation. The APEP was funded
by the Australian federal government’s Department of
Education and Employment Relations with the purpose
of enhancing the school readiness of Aboriginal
children aged 0-5 by increasing the level of parental
engagement in education. An ex-post realistic
evaluation was conducted both to assess the program
outcomes and to specify the underlying mechanisms
and contexts determining the program’s impact. The
results of the evaluation demonstrate that multiple
mechanisms and contexts contribute to parental
engagement. For instance, the analysis of the surveys
enabled evaluators to draw conclusions on how the
program outcomes varied according to the families’
initial circumstances. The results of this evaluation
can be used by policy-makers and program developers
to improve future policies, taking into consideration
the identified mechanisms and contexts (Cargo and
Warner, 2013).
REFERENCES
Cargo, M., Warner, L., 2013. “Realist evaluation” in action:
a worked example of the Aboriginal Parental
Engagement Program. Melbourne: Australian
Institute of Family Studies.
https://www2.aifs.gov.au.
Weiss, C.H., 1998. Evaluation, Second Edition.
London: Prentice Hall.
Cousins, J.B., Earl, L.M., 1992. The case for participatory
evaluation. In Educational evaluation and policy
analysis. 14(4). pp. 397-418.
Crabbé, A., Leroy, P., 2012. The Handbook of
Environmental Policy Evaluation. London: Earthscan.
Estrella, M., Gaventa, J., 1998. Who counts reality?
Participatory monitoring and evaluation: a literature
review. Brighton: Institute of Development Studies.
IDS Working Paper 70.
Garbarino, S., Holland, J., 2009. Quantitative and
qualitative methods in impact evaluation and measuring
results.
Gill, M., Turbin, V., 1999. Evaluating “realistic
evaluation”: evidence from a study of CCTV. In Crime
Prevention Studies. 10(1). pp. 179-199.
Guijt, I., Gaventa, J., 1998. Participatory Monitoring and
Evaluation: Learning from Change.
https://www.ids.ac.uk.
Guijt, I., 2014. Participatory Approaches. In
Methodological Briefs: Impact Evaluation 5.
Florence: UNICEF Office of Research.
http://devinfolive.info.
Harris, M.J., 2010. Evaluating public and community
health programs. John Wiley & Sons.
Hogwood, B.W., Gunn, L., 1984. Policy Analysis for the
Real World. London: Oxford University Press.
Khandker, S.R., Koolwal, G.B., Samad, H.A., 2010.
Handbook on impact evaluation: quantitative methods
and practices. Washington, DC: The World Bank.
Parsons, W., 1995. Public policy: An introduction to the
theory and practice of policy analysis.
Pawson, R., Tilley, N., 1997. Realistic Evaluation.
London: SAGE.
Pawson, R., Tilley, N., 2004. Realist Evaluation.
http://www.communitymatters.com.au.
Rao, V., Woolcock, M., 2003. Integrating qualitative and
quantitative approaches in program evaluation. In The
impact of economic policies on poverty and income
distribution: Evaluation techniques and tools. pp. 165-
190.