Evaluation, Policy & Complexity
How to improve the evaluation of complex systems
to better inform policy-making: Learning from
evaluating Defra’s Reward & Recognition Fund
Energy Evaluation Academy Webinar – June 26th 2019
Sara Giorgi, Research Consultant & Evaluator
Presentation outline
Aims of webinar
What is CECAN?
Research brief – introducing Defra’s RRF
Experience of evaluation
Informing policy
Understanding complexity
Concluding insights
Q&A and discussion
AIMS OF WEBINAR
Aims of webinar: What will you gain from
the session?
1. First to hear the research findings
2. Constructive dialogue, and a knowledge
and practice exchange
∙ Exploring: experience of evaluation, informing
policy and understanding complexity
3. Find out about CECAN
Aims of webinar
What do you expect to get
out of this session?
WHAT IS CECAN?
What is CECAN?
A £3m UK research centre hosted by
the University of Surrey, bringing together
experts to address some of the big issues
in policy-making and evaluation
CECAN pioneers, tests and promotes
innovative evaluation approaches and
methods across policy areas where food,
energy, water and environment intersect
Who is CECAN?
Funded by
A network of expert partners:
Centre for the Evaluation of Complexity
across the Nexus
Why CECAN?
‘Nexus’ issues - concerning the
nature and interaction of food,
water, energy, climate and
ecosystems - are complex, with
many diverse, interconnected
factors involved. This presents
a major challenge to policy-
making because changing one
factor can often have
unexpected knock-on effects in
seemingly unrelated areas. We
need new ways to evaluate
policy in these situations.
www.cecan.ac.uk
How will CECAN create impact?
…By influencing the practice of evaluation for the
Nexus to make it fit for a complex world through…
Leadership
• Publications e.g. EPPNs
• Connecting people (like in today’s webinar)
Case Studies with Partners/Funders
• Fully embedded e.g. work placement, PhD, fellowships
• Bridge building
Translate or create a new method
• Method development
Fellowships, Doctoral students
• Capacity building
• Bridge building
How can CECAN add value for you?
New tools and support for evidence-based policy-
making
Fresh approaches to appraisal and evaluation
Embraces an 'open research' culture of knowledge
exchange
Events, publications, dialogues, co-designed case
studies
Intends to provide solutions, not add to burdens
Find out more via the CECAN animation and our
website
RESEARCH BRIEF
Research objectives
To further explore evaluation experiences and
challenges, especially those encountered in
Defra’s Reward & Recognition Fund (RRF);
To unpack the relationship between evaluation
and policy-making, especially looking at the
policy cycle; and
To investigate how complexity is understood
and how it could be useful in policy-making.
Research method
Secondary review:
- Notes from evaluation workshops
- Data on evaluation process from
site visit write-ups of 31 schemes
- Ten practitioner
interviews with RRF
scheme managers or
delivery contractors
- Ten policy stakeholder
interviews with varied
engagement in the RRF
and varied policy
backgrounds
What is the RRF?
Reward and Recognition Fund (RRF) launched in June 2011
Up to £2 million from 2011 to 2014
A support package for 31 schemes run by civil society
organisations and local authorities
Schemes differed in: behaviours, geographical
locations, reach, audiences, delivery mechanisms, engagement,
material type, time frames, etc.
All tested the impact of reward and recognition on increasing
recycling and reuse (positive waste behaviours)
A safe space to foster innovative schemes to inform best practice
A process and narrative evaluation using key impact indicators
and case studies was carried out by Brook Lyndhurst – published
report
Programme
Evaluation
What about energy? Is this not too UK-centric?
Feeling short-changed?
Don’t! You will feel richer
by the end of the webinar!
EXPERIENCE OF EVALUATION
Broad experience of evaluation
Practitioner:
∙ Need for black and white data on how schemes are working vs much messier practice
∙ Behaviour change takes a long time to embed, evaluation needed to show results sooner
∙ Sense of purpose - felt part of a nation-wide programme
∙ Onerous evaluation demands
∙ Worthwhile to show scheme effectiveness
Policy stakeholder:
∙ Evaluation key to good policy-making
∙ Clear evaluation objectives from the outset but need to be adaptive – an iterative process
∙ ‘Evaluation’ as a continuum of understanding the evidence; not an isolated, one-off activity
∙ Evaluation is both an external, accountability exercise and an internal, reflective dialogue of what works
∙ Policy isn’t driven by evaluation outcomes
Evaluation journey
Design &
plan
Commission
& detailing
Implement &
analyse
Complete &
Use results
Phases link up in theory but not in
practice
Self-contained phases but
evaluation is/should be done
continuously
Expectation that design & planning
happens in tendering process by
research community rather than
internally
Different evaluation approaches
for different Government
departments/ agencies - ranging
from locally-run, laissez-faire style
to centrally managed, hands-on
style
“What we tend to do is produce
a huge wish list of things we
would like the evaluation to
achieve…Everything else flows
reasonably well except for the
fact that we sometimes change
our minds, so commissioning
implementation etc. will broadly
happen then in the way you
would expect them to do within
consultation. But I think the
thing that’s really missing is that
actually designing and thinking
logically about what we really
want before we actually
commission it."
Policy stakeholder
Practitioner evaluation challenges
Data
Impact &
attribution
Comparison
groups
Policy stakeholder evaluation challenges
A different perspective but similar issues…
Data – quality, availability and relevance
Isolating impact of policy – many intervening factors at
play
Identifying control groups
Limited techniques and methods at their disposal
Lack of clear objectives
Budgets
Variety and diversity of projects
Mismatch of skills and capacity for those carrying out
evaluation
The delivery-evaluation relationship
‘Tug of war’ – a tense but not
polarised relationship with
evaluation taking time, resources
and energy away from delivery
Complementary partnership –
where evaluation feeds into and
is integrated in the project’s
delivery activities
But more often a…
RRF Example: A special case?
Practitioners felt part of a
national call to action to pilot
rewards and recognition in
increasing recycling and reuse
Rationale behind evaluation was
a proof of concept of whether
policy has or hasn’t worked
More hands-on, onerous and
detailed compared to other
funding streams
A lot of intervening factors and
background noise made
attribution to scheme, let alone
reward element difficult, if not
impossible
“So I think it actually was
a benefit knowing that
…we were tying into
other schemes and
contributing into a much
wider intelligence base."
Practitioner
“There was a
responsibility as part of
the funding to also ensure
that we were using that in
a way that it was given to
us and that we were
giving something back for
that. So I was very aware
that it was there but not in
a suffocating way."
Practitioner
So what? Experience of evaluation
Drawing from practice, how can these insights improve policy-making…
Creating a wider sense of purpose in an evaluation can help
nurture buy-in
If evaluation is planned, resourced and budgeted into the
scheme from the outset, the relationship between project
delivery and evaluation need not be a tense ‘tug of war’
Acknowledgement that practitioners and policy stakeholders
experience similar evaluation challenges
Evaluation needs to be an integrative, continuous process not
a one-off standalone activity or a series of self-contained
steps
In the environment field, both scheme/service delivery and
policy development happen in a ‘messy’ context with many
intervening factors at play; this makes attribution difficult
INFORMING POLICY
Practitioners & policy development
Most RRF practitioners acknowledged
that they were part of a policy
development exercise but did not have a
deep understanding of what this meant
Evaluation & policy – not a perfect match
Different speeds – working to two
distinct tempos
∙ Fast paced, dynamic, quick turnaround
of policymaking versus the analytical,
detailed, long timeframe of evaluations
Evaluators/analysts need to feel
comfortable with ‘good enough’ and
‘impact at this point in time’
Policy stakeholders need to feel
comfortable with the risk that end
conclusions may be different
Given the new regulatory regime
with post-implementation reviews
and future policy reform, evaluation
is more important
"Just in the example of Reward and
Recognition by the time that evaluation
was done, signed off and published it
wasn’t really on anyone’s agenda
anymore, it wasn’t topical, it took too
long to do the evaluation but it needed
that long because you have to pilot it,
you have to evaluate it, you had to
write it up. So it is a big dilemma.”
Policy stakeholder
Familiarity with policy cycle
Source: Defra (undated) Inside Defra: How Defra works, p. 11
http://www.larpnet.com/downloads/insidedefra.pdf
“I think it’s just how things are done I
don’t think people even think
consciously ‘oh no, I’m going to start
using the policy [cycle]’ if you see what I
mean? That is just business as usual.”
Policy stakeholder
High levels of familiarity
Mainly applied to new policies
A discursive, process tool
Justification of Department’s work
Don’t ‘use’ it, it just ‘is’ – part of
standard operating procedure
Good in theory, not in practice
Wheel makes it look cyclical, but it
effectively describes a linear process
“ Although it goes round in a circle, it is still
essentially describing a nice neat linear
process which doesn’t exist in the real
world. I think, also, it doesn’t demonstrate
how evidence is used throughout that
cycle.”
Policy stakeholder
“An ideal model that never actually
happens in practice.”
Policy stakeholder
How does/should evaluation fit into the policy cycle?
[Diagram: the policy cycle drawn as a wheel with ‘E&E’ at every stage, with icons marking timely input and policy makers working with analysts, leading to a complete formal evaluation]
Key: E&E = Evidence gathering & preparation for evaluation
“So I think evaluation, it almost
shouldn't be at every step it
should be all the way through
without being a step, does that
make sense? It should be a way
of working."
Policy stakeholder
“I think the most useful message
for the Policy Cycle it’s almost
never too late to insert ...
evaluation thinking.”
Policy stakeholder
Challenges for embedding evaluation
into the policy cycle
Policies not being ‘evaluable’
Lack of time
Policy changing over time
Tempo mismatch
Working culture
Senior management buy-in
Political will
Other more common challenges:
Capacity and capabilities
Data
Resources and costs
Interrelated systems
RRF Example: Impact on policy
The results and process of the RRF led to impact on…
Social research:
∙ Funding of action-based research projects
∙ Useful insight on how to set-up schemes
∙ Rich learning documented and shared amongst analysts
Policy
∙ Informs current thinking on levers of behaviour change
∙ Used in different policy circles across waste streams
∙ Rewards not considered as a measure to increase
recycling
On the ground practice
∙ Report made available to other schemes
∙ Dissuaded some local areas from taking up rewards
∙ A few practitioners felt the RRF left a legacy in their local
communities; overall, the jury is still out
So what? Informing policy
How can evaluation be better integrated into policy-making…
Acknowledge the time scale disparity and work with ‘good
enough’ and ‘at the time’ insight
Closer collaboration between policy makers and analysts
Ensure the policy cycle is an actual way of working
Evaluation to form part of initial thinking
‘Preparing for evaluation’ to feed into each phase – not an
additional burden
Evaluation cannot delay or derail policy development; it needs
to complement it
UNDERSTANDING COMPLEXITY
Defining complexity
Defining complexity in the RRF
If policy issue is complex,
evaluation doesn’t have to be
Understanding behaviour
change always complex
Lack of transferability or
replicability of a scheme an
indicator of complexity
Background noise makes it
hard to isolate impact
New, innovative areas
Controversial policy/ issue
Unintended benefits/
consequences
Different impact across the
same audience
Challenge is complicated,
while the system with its
intervening factors, interactions
and trade-offs is complex
Simple concept – ‘rewarding
people’ but context is complex
Cannot visualise impact when
it comes to waste
Complicated vs complex –
around predictability, lack of
control (esp. of externalities)
(Two-column slide: Practitioner | Policy stakeholder)
RRF Example: What makes it complex?
“The evaluation doesn’t need to
be complex at all if you actually
know what your goals are.
Complex projects are always
going to be around, we are
never going to simplify it, but it
is [about] how you develop your
evaluation protocols.”
Practitioner
“Comparing different schemes it’s
very difficult to transfer a scheme
from one area to another in those
terms. Sometimes communities can
vary wildly from area to area so you
might find your trial area works very
well but if you transfer that even to a
community that’s next to it, it might
be very difficult to replicate the
results.”
Practitioner
" I think it is complex because it involves a
diffusion of different people in different
situations with different motivations and
different needs facing different physical
barriers, motivational barriers, financial
barriers or situations maybe better than
financial barriers. So having a policy that
influences all of those people to do the same
thing in the same way to the same extent is
obviously unachievable and therefore there
must be complexity in the policy solution to
that problem.”
Policy stakeholder
“But also you’re dealing with quite complex
systems where there’s lots of interactions,
there’s lots of trade offs and things like that
that can be quite challenging. And also
you’re dealing with complex human
behaviours."
Policy stakeholder
A closer look at complexity: the nuances
Contextual specificity, attribution difficulties and
background noise resonated well
Perspective of complexity – evaluation, policy,
issue, scheme concept, etc.
A question of framing – an issue is complex but a
policy or evaluation doesn’t have to be complex
Interrelated systems make causality difficult
Need to look beyond the intended outcomes
Complexity: what’s in a name?
‘Complex’ issues not necessarily formally recognised as
such
Term not considered off-putting or negative but some
said ‘complex’ label may deter pursuit of certain policies
Any assessment of complexity needs to be integrated in
existing appraisal mechanisms and framed as an
opportunity
Complexity & evaluation methods
Does/ should complexity affect the
type of evaluation carried out?
∙ 2 No; 5 Yes; 3 Don’t know
Some appreciation that complexity
precludes certain evaluation methods
Complexity not the only issue – can’t
lose sight of the bigger picture
Not helpful to cluster policies under
different headings or techniques
“I don’t agree with the “What works” centres, I
think they are fundamentally flawed because
even if you can do a really robust RCT type
evaluation all that will tell you is it worked in that
context at that particular time, delivered in that
particular way, and we know from our experience
that you don’t have to deviate very much from
the delivery model to get a completely different
result.”
Policy stakeholder
“Yes it might be that for the particular
complexity that it’s just not possible to
use one of those research designs…If
you’ve got an area where you’ve just
got a lot of different policies working
then actually measuring the precise
impact that each one has had rather
than understanding the cumulative
impact can be quite tricky.”
Policy stakeholder
“I think that’s why CECAN was
set up, isn’t it, in the sense that
we recognise that actually our
ability to evaluate these sorts of
things is not particularly great,
and I think that we are hoping for
insights into how to do it better.
So no I don’t think it is, I think it is
recognised as an issue, but I
don’t think that currently we are
particularly good at doing it.”
Policy stakeholder
So what? Understanding complexity
How can understanding complexity better inform evaluation and policy-
making…
Complexity can be the common trait across policy issues that
involve governance challenges, are interrelated, and for which
impacts are difficult to measure and attribute
Context, attribution and background noise were aspects that
resonated well with interviewees when discussing complexity
Label of ‘complexity’ isn’t that important but framing is – an
opportunity
Acknowledging complexity overtly and, perhaps, formally can
help with evaluation and thus improving policy-making
Any assessment of a policy’s complexity has to be integrated
into existing appraisal mechanisms; there is no appetite for
another process
CONCLUDING INSIGHTS
Concluding remarks
Evaluation needs to be an integrative, continuous
process not a one-off exercise at the end or a
series of self-contained steps – a way of working
Acknowledge the time scale disparity between
policy and evaluation, use ‘good enough’ and ‘at
the time’ insight and embed ‘preparing for
evaluation’ especially in initial policy design phase
Recognising complexity explicitly can better equip
policy stakeholders and practitioners with ‘smart’
evaluation approaches
CECAN can help further all three points…
Q&A AND DISCUSSION
Questions to discuss
EVALUATION: How can your own experience of
evaluation and its challenges help improve policy-
making? What is your experience of evaluation?
Does it chime with the research insights?
POLICY: How can evaluation be better integrated
into policy-making? What is your experience of
evaluation informing or not informing policy? Does
it chime with the research insights?
COMPLEXITY: What is your understanding and
experience of complexity? Does it chime with the
research insights?
sara.giorgi@brooklyndhurst.co.uk
@energyeval
@cecanexus
www.cecan.ac.uk/