27.03.2013
Set-theoretic, diagnostic
and Bayesian approaches
to impact evidence
Barbara Befani
Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future
Brighton, 26-27 March 2013
Outline
Set-theoretic Methods (e.g. QCA) and the new
challenges
– Uncertainty (equifinality)
– Causal contribution (multiple-conjunctural
causality)
– Causal asymmetry (necessity and sufficiency)
Diagnostic and Bayesian approaches
– Uncertainty (can be quantified with probabilities)
– The strength of qualitative evidence can be
measured
Defining & explaining events with Set Theory
In uncertain and emergent contexts, we cannot
define “impact” (or success) precisely
An Impact “space” of possible events, all desirable
– All compatible with given values and goals
Success is likely to look like ANY of a number of
events = a LOGICAL UNION
Success looks more like “being on the right track”
than achieving a specific goal
Being “on the right track” means avoiding a number
of pitfalls / dead ends
Sets can be defined as NEGATION of other sets
The three main operators in set theory are
– Negation, union, intersection
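A minimal Python sketch of this framing (all event and pitfall names are invented for illustration, not taken from a real evaluation): success is defined as the union of several desirable events, and "being on the right track" as the negation of known pitfalls within the space of possible events.

```python
# Illustrative sketch only: event names are assumptions, not from a real evaluation.
possible_events = {"policy_adopted", "capacity_built", "pilot_scaled",
                   "funding_cut", "programme_captured"}

# Success space = LOGICAL UNION of desirable events (any one of them counts).
success_space = {"policy_adopted"} | {"capacity_built"} | {"pilot_scaled"}

# Pitfalls / dead ends to avoid.
pitfalls = {"funding_cut", "programme_captured"}

# "Being on the right track" = the NEGATION of the pitfalls within the space
# of possible events (equivalently, the intersection of each pitfall's negation).
on_the_right_track = possible_events - pitfalls

print(success_space == on_the_right_track)   # True for these illustrative sets
```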
Causal Asymmetry and Contribution Analysis
What is a sufficient causal package? (in the causal diagram, a branch of blue nodes)
Principal Contributory Cause = INUS condition: an Insufficient but Necessary part of an Unnecessary but Sufficient combination (each blue node of a given branch)
In Set Theory terminology, a causal package is an INTERSECTION of contributory causes
A combination of necessary causes (each necessary within that causal package)
Set Theory provides the mathematical basis for
1. analyzing causal contribution
2. dealing with uncertainty (particularly Fuzzy Sets)
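A minimal set-theoretic sketch of a causal package, in Python (the case labels and set memberships below are invented, loosely echoing the vaccine truth table on the next slide): each contributory cause is the set of cases where it is present, the package is their intersection, sufficiency means the package is contained in the outcome set, and a cause is an INUS condition if dropping it breaks that sufficiency.

```python
# Minimal sketch: a causal package as an INTERSECTION of contributory causes.
# Case labels and memberships are illustrative assumptions.
cold_chain = {"A", "B", "C", "D"}   # cases with Cold Chain Integrity
vaccine    = {"A", "B"}             # cases receiving the Vaccine intervention
success    = {"A", "B", "C"}        # cases showing reduced specific morbidity

# The causal package V * CCI is the intersection of its contributory causes.
package = vaccine & cold_chain

# Sufficiency: every case covered by the package also shows the outcome.
is_sufficient = package <= success

# INUS check for one contributory cause: without it, the remaining package
# covers cases that do NOT show the outcome, so the cause is a necessary
# part of that (sufficient) combination.
package_without_vaccine = cold_chain
vaccine_is_inus = is_sufficient and not (package_without_vaccine <= success)

print("package sufficient:", is_sufficient)              # True
print("vaccine is an INUS condition:", vaccine_is_inus)  # True (case D breaks sufficiency)
```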
UNION of A and B: A ∪ B, A OR B, A + B
INTERSECTION of A and B: A ∩ B, A AND B, A * B
NEGATION of A: ~A, NOT A
[Venn diagrams illustrating each operator on sets A and B]
Conf ID | Cold Chain Integrity (necessary) | Vaccine (V) (intervention) | Health System Quality (HSQ) | Success (S) (reduction in specific morbidity) | No. Cases
A | 1 | 1 | 1 | 1   | 3
B | 1 | 1 | 0 | 1   | 1
C | 1 | 0 | 1 | 1   | 2
D | 1 | 0 | 0 | 0   | 2
E | 0 | 1 | 1 | 1/0 | 0
F | 0 | 1 | 0 | 1/0 | 0
G | 0 | 0 | 1 | 1/0 | 0
H | 0 | 0 | 0 | 1/0 | 0
Why are Causal Combinations important?
Impact is contingent on the context
Finding ONE counterfactual is not enough
QCA helps find many counterfactuals through systematic cross-case comparison
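A rough Python sketch of the kind of cross-case comparison that QCA systematises, using the observed rows of the truth table above. This is only a simple necessity/sufficiency consistency check, not the minimisation algorithm implemented in QCA software.

```python
# Observed configurations from the truth table above (rows with at least one case).
# Columns: Cold Chain Integrity (CCI), Vaccine (V), Health System Quality (HSQ), Success (S).
rows = {
    "A": {"CCI": 1, "V": 1, "HSQ": 1, "S": 1},
    "B": {"CCI": 1, "V": 1, "HSQ": 0, "S": 1},
    "C": {"CCI": 1, "V": 0, "HSQ": 1, "S": 1},
    "D": {"CCI": 1, "V": 0, "HSQ": 0, "S": 0},
}

def necessary(condition):
    """A condition is necessary if it is present in every successful configuration."""
    return all(r[condition] == 1 for r in rows.values() if r["S"] == 1)

def sufficient(*conditions):
    """A combination is sufficient if every configuration containing it is successful."""
    covered = [r for r in rows.values() if all(r[c] == 1 for c in conditions)]
    return bool(covered) and all(r["S"] == 1 for r in covered)

print("CCI necessary:", necessary("CCI"))               # True
print("V   necessary:", necessary("V"))                 # False (case C succeeds without it)
print("CCI*V   sufficient:", sufficient("CCI", "V"))    # True
print("CCI*HSQ sufficient:", sufficient("CCI", "HSQ"))  # True
print("CCI alone sufficient:", sufficient("CCI"))       # False (case D fails)
```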
Probability and Diagnosis: what the evidence
says
Realm of “unknown knowns”
General problem of the strength / quality of
evidence: how to assess it?
In clinical practice, physicians use tests
– Specificity
• Probability that absence of the disease will
return negative evidence on that test
– Sensitivity
• Probability that presence of the disease will
return positive evidence on that test
– (Positive) Predictive Power
• Probability that positive evidence signals
presence of the disease
When is evidence strong?
When it is sensitive and specific
– Sensitive: P ( Evidence | Impact ) high
– Specific: P ( Evidence ) low
• false positives are low
– Predictive: P ( Impact | Evidence ) high
– The latter can be calculated with the Bayes
formula P ( I | E ) = P ( I ) * P ( E | I ) / P ( E )
Two important principles of high-quality evidence
– of all kinds, quali, quanti, etc.
Evidence is strong when:
– The prior probability of observing positive
evidence P ( E ) is LOW (~specificity)
– The probability of observing positive evidence IF
the intervention was successful / had an impact
P ( E | I ) is HIGH (sensitivity)
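A small numerical sketch of the Bayes formula above, with purely illustrative probabilities; P(E) is expanded over the two states of the world (impact / no impact), which is one common way to obtain it when only the prior, the sensitivity and the false-positive rate are assumed.

```python
# Bayes formula: P(I | E) = P(I) * P(E | I) / P(E)
# Numbers below are illustrative assumptions, not taken from any evaluation.
p_impact = 0.5             # prior probability of impact, P(I)
sensitivity = 0.9          # P(E | I): probability of seeing the evidence if there was impact
false_positive_rate = 0.1  # P(E | ~I): probability of seeing the evidence anyway

# Expand P(E) over the two possible states of the world.
p_evidence = sensitivity * p_impact + false_positive_rate * (1 - p_impact)

# Predictive value of positive evidence.
p_impact_given_evidence = p_impact * sensitivity / p_evidence

print(f"P(E)     = {p_evidence:.2f}")               # 0.50
print(f"P(I | E) = {p_impact_given_evidence:.2f}")  # 0.90: strong evidence
```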
[Diagram: Impact ‘I’ and Evidence ‘E’ (of ‘I’) as overlapping sets, distinguishing weak, strong, sensitive, specific and predictive evidence]
Seeking evidence of impact: diagnostic tests

                       | REALITY: Intervention Successful (I) | REALITY: Intervention UNsuccessful (~I)
Evidence Positive (E)  | True Positive                        | False Positive
Evidence Negative (~E) | False Negative                       | True Negative
(rows: EVIDENCE about the success of the intervention; columns: REALITY about the success of the intervention)

Sensitivity = Σ True Positive / Σ Intervention Successful
Specificity = Σ True Negative / Σ Intervention UNsuccessful
Positive predictive value = Σ True Positive / Σ Evidence Positive
Negative predictive value = Σ True Negative / Σ Evidence Negative
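These four measures can be computed directly from case counts in such a table; a minimal sketch with invented counts:

```python
# Diagnostic measures from a 2x2 table of (invented) case counts.
true_positive  = 40   # evidence positive, intervention successful
false_positive = 5    # evidence positive, intervention unsuccessful
false_negative = 10   # evidence negative, intervention successful
true_negative  = 45   # evidence negative, intervention unsuccessful

sensitivity = true_positive / (true_positive + false_negative)   # / Σ successful
specificity = true_negative / (true_negative + false_positive)   # / Σ unsuccessful
ppv = true_positive / (true_positive + false_positive)           # / Σ evidence positive
npv = true_negative / (true_negative + false_negative)           # / Σ evidence negative

print(f"sensitivity = {sensitivity:.2f}")  # 0.80
print(f"specificity = {specificity:.2f}")  # 0.90
print(f"PPV         = {ppv:.2f}")          # 0.89
print(f"NPV         = {npv:.2f}")          # 0.82
```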
When articulated ToCs explaining impact are supported by evidence, THIS IS STRONG EVIDENCE OF IMPACT
The prior probability of observing a sophisticated ToC with several components is LOW, because
the probability of a combination is the product of the probabilities of its components (a very SMALL number), assuming the components are independent:
P (a, b, c, ..., n) = P (a) * P (b) * P (c) * ... * P (n)
When ToCs with many components are confirmed, it is strong evidence of impact, because:
– the chances of all components being observed simultaneously were LOW
• P ( E ) low, specificity high
– if the ToC explaining impact holds true, the probability of observing evidence of all components is HIGH
• P ( E | I ) high, sensitivity high
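To run the numbers on this argument (all probabilities below are illustrative assumptions): one coherent reading is to treat the product of component probabilities as the chance of observing the whole ToC if the intervention had no impact, then apply the Bayes formula from the earlier slide; independence of the components is assumed.

```python
from math import prod

# Illustrative assumptions: probability of observing each ToC component "by chance",
# i.e. if the intervention had NO impact, and assuming the components are independent.
p_component_given_no_impact = [0.3, 0.4, 0.5, 0.3]

p_impact = 0.5           # prior P(I)
p_e_given_impact = 0.9   # P(E | I): if the ToC holds, all components are very likely observed

# P(E | ~I): product of the component probabilities -- a very small number.
p_e_given_no_impact = prod(p_component_given_no_impact)

p_evidence = p_e_given_impact * p_impact + p_e_given_no_impact * (1 - p_impact)
posterior = p_impact * p_e_given_impact / p_evidence

print(f"P(E | ~I) = {p_e_given_no_impact:.3f}")   # 0.018
print(f"P(I | E)  = {posterior:.3f}")             # ~0.980: strong evidence of impact
```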
Conclusion: let’s use other branches of
mathematics
In the form of SET THEORY or PROBABILITY
THEORY (used differently than in frequentist statistics)
They provide new ways of dealing with uncertainty
SET THEORY helps with:
– Defining success in a more flexible, open and
inclusive way (being “on the right track”)
– Explaining success by defining and identifying
contributory causes rigorously through data
analysis (e.g. with QCA)
PROBABILITY THEORY helps with:
– Assessing the strength of evidence in terms of
sensitivity, specificity and predictive value
– Qualitative evidence CAN be strong if a number
of conditions are met
– Carefully weigh each piece of evidence as in a court of law,
using conditional and subjective probabilities
References
Befani, B. (2013) “Between Complexity and Rigour: addressing evaluation challenges with QCA”, Evaluation (forthcoming)
Befani, B. (2013) “What were the chances? Diagnostic Tests and Bayesian Tools to Assess the Strength of Evidence in Impact Evaluation”, CDI Practice Paper (forthcoming)
Mayne, J. (2013) “Making Causal Claims” (presentation to this event)
Marchal, B. (2013) “Conceptual distinctions: Complexity and Systems – Making sense of evaluation of complex programmes” (presentation to this event)