Outcome-Based Measurement

From Theory to Implementation
Ghebray Consulting
647-823-5402
ghebrayconsulting@rogers.com
Workshop Objectives
1. Examine the process of developing &
   implementing outcome-based measurement

2. Reflect on the value of embedding Theory of
   Change in developing a program theory or logic
   model

3. Explore how the process of developing &
   implementing outcome-based measurement can
   be a vehicle for building evaluative culture
Agenda
1. Presentation: An overview of measurement &
   evaluation (15 minutes)
2. Discussion (10 minutes)
3. Presentation: walking through the program planning
   cycle – embedding theory of change (15 minutes)
4. Break (10 minutes)
5. Small group discussion (50 minutes)
6. Presentation: Measurement plan (15 minutes)
7. Facilitated discussion: Building evaluative culture (25
   minutes)
8. Final thoughts & reflections (10 minutes)
Measurement vs. Evaluation
Measurement
• Measurement is the process of systematically and regularly collecting data on program quality, outputs, and outcomes for program participants (not intended to establish a causal relationship).

Evaluation
• Evaluation is a form of research: an in-depth and rigorous effort to measure program impact, using scientific research methods to compare outcomes with what would have happened in the absence of the strategies or intervention(s).
INPUTS → STRATEGIES → OUTCOMES

• Inputs: What was the quantity & quality of resources used to implement strategies? Which resources were most important for providing a “high quality” strategy?
• Strategies: What was the quantity & quality of the strategies provided? Which strategies were most important for achieving the desired outcomes?
• Outcomes: Did participants change? If so, how much and in what ways?
INPUTS → OUTPUTS → OUTCOMES → IMPACT

• Monitoring (inputs): What is the quantity & quality of resources we have invested in this program? Are they aligned with program goals?
• Performance measurement (outputs): What is the quantity and quality of the program or services we have delivered?
• Outcome measurement (outcomes): What are the results for participants and other stakeholders?
• Impact study (impact): What are the impacts that can be attributed to the program or services?
Monitoring Questions
• How many clients are in the program?

• What are the socio-demographic characteristics
  of those in the program?

• Are program participants part of the intended
  or target population?

• How are program funds being used?
Performance Measurement Questions
• How well was the program marketed or
   promoted?

• Did the program offer an adequate, high-quality
  strategy or intervention?

• How do clients perceive the program?

• Are clients satisfied with the service and their
  encounters with service providers?
Outcome Measurement Questions
•Did clients attain the intended outcomes?

•What did clients learn?

Impact Study Questions
• Has clients’ quality of life improved following
the program experience? Is the program responsible
for their improved quality of life?

• Is client participation in the program
“responsible” for their improvement?
INPUTS → OUTPUTS → OUTCOMES → IMPACT
MONITORING | PERFORMANCE MEASUREMENT | OUTCOME MEASUREMENT | IMPACT STUDY

Monitoring, performance & outcome measurement:
• Documenting: tracking investment (monitoring & performance measurement) and progress (outcome measurement)
• Some evaluation experience & expertise required
• Low/reasonable cost; shorter time

Impact study:
• Research: establishing causality
• High evaluation experience & expertise required
• High cost; longer time
Questions, Thoughts, Discussions…
Before Implementing Outcome-based
Measurement…
1. A thorough program description
• Program rationale, definition of the problem & feasibility of strategies or interventions

• Clarity of causal assumptions: between resources, strategies or interventions, and expected outcomes, as well as performance measures or indicators

2. Capacity & experience of program leaders to facilitate and act on findings & lessons learned
The Program Planning Cycle (with key stakeholders engaged at the center):

1. Community analysis, asset & needs assessment
2. Design program
3. Design M & E plan
4. Develop data collection tools
5. Implement program
6. Collect data
7. Data analysis
8. Share findings & lessons learned
9. Decision making (feeding back into community analysis)
CONTEXTUAL ANALYSIS → ASSUMPTIONS → INPUTS → STRATEGIES → OUTPUTS → OUTCOMES

• Contextual analysis: the reason behind the need for the program
• Assumptions: the implicit theory or belief about why the program will work
• Inputs: resources dedicated to the program
• Strategies: what the program does with inputs
• Outputs: direct products of program strategies
• Outcomes: benefits for clients during & after the program
Employment Program Example

Contextual analysis:
- People in our community have few job skills, bad or no jobs & limited job histories
- They have few job training & job placement opportunities

Assumptions:
- The absence of a job history perpetuates unemployment
- Job search skills training helps with job readiness
- Job mentorship can enhance on-the-job learning
- Job placement helps establish a job history

Inputs:
- 2 F.T.E. staff
- Program & counseling space
- Program supplies & equipment

Strategies:
- Weekly soft skills classes
- Match clients & mentors
- 8 months of supervised job training & mentoring sessions

Outputs:
- # of classes & mentorship sessions offered
- # & type of clients served

Outcomes:
- Clients learn soft skills & job strategies
- Clients establish supportive professional relationships with mentors

Impact:
- Clients find jobs, remain employed & establish job histories
Small Group Discussion…
Using Program Theory to Plan and Implement Outcome-Based Measurement
What does “success” look like?

  According to Mark Friedman
(2005), three basic questions can
 help us assess program success

1. How much did we do?

2. How well did we do it?

3. Is anyone better off?
Outcome Indicators (Is anyone better off?)
• Outcome indicators measure how much progress
  was made towards achieving the outcome. For the
  outcome “Newcomers are job ready to compete in
  the Canadian job market,”
outcome indicators include:
• Rate of successful training & job placement completion
  (i.e. ability to prepare a resume, positive feedback on
  mock interviews & job placement)

• Maintaining a professional relationship with mentors 6
  months after the end of the program
Process Indicators (How well did we do it?)
• Process indicators measure the ways in which
   program services are delivered. For the goal
   “Provide high quality service,”
process indicators include:
• Qualification & experience of mentors

• Suitability of client/mentor match & job
  placement

• Rate of client satisfaction
Output Indicators (How much did we do?)
• Output indicators measure the quantity or volume
  of services produced. For the goal “Provide
  adequate support to the targeted population,”

output indicators include:
- # & types of services offered (i.e. learning sessions,
   employment counseling, mentorship, etc.)

- # and characteristics of clients served

- # of mentors recruited & matched
Measuring Program “Success”                                    #    %
1. Meet outreach target                                       150  100
2. Attendance during orientation, intake & level of
   registration in program (# & type of clients served)        75   50
3. High attendance, participation & rate of satisfaction       65   43
4. Knowledge of job search strategies, Canadian labor
   market & workplace culture                                  50   33
5. Successful completion of program (including job
   placement)                                                  45   30
6. # employed in a related profession (at $17.00/hour)         26   17
7. Favorable job assessment (tracked quarterly)                20   13
8. Job retention (for at least one year)                       15   10
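The % column above is each stage’s count divided by the outreach target of 150, rounded to the nearest whole percent. A short sketch (stage labels abbreviated, and hypothetical as variable names) reproduces the column:

```python
# Reproduce the % column of the "success" table: each count
# divided by the outreach target (150), rounded to a whole percent.
stages = [
    ("Meet outreach target", 150),
    ("Orientation, intake & registration", 75),
    ("High attendance, participation & satisfaction", 65),
    ("Knowledge of job search strategies & workplace culture", 50),
    ("Successful completion (incl. job placement)", 45),
    ("Employed in a related profession at $17.00/hour", 26),
    ("Favorable job assessment (quarterly)", 20),
    ("Job retention (at least one year)", 15),
]

target = stages[0][1]  # the denominator for every stage
percentages = [round(100 * count / target) for _, count in stages]

for (name, count), pct in zip(stages, percentages):
    print(f"{name}: {count} ({pct}%)")
```

Laying the stages out this way makes the funnel explicit: each later stage is a subset of the earlier ones, so the percentages can only fall.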
The Measurement & Evaluation Plan

Columns: Program Component to be Measured | Indicators | Data Sources | Data Collection Methods | Who Collects Data | When to Collect Data | Who Needs the Findings & Lessons Learned

Rows:
• How much did we do? → Scope & reach
• How well did we do it? → Program or service quality
• Is anyone better off? → Outcomes for participants
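The blank plan above is usually kept as a spreadsheet, but it can also be stored as structured data so the plan stays consistent across programs. A minimal sketch of one plan row, with hypothetical field names and illustrative values loosely drawn from the employment program example:

```python
from dataclasses import dataclass

# One row of the measurement & evaluation plan template.
# Field names and example values are illustrative, not prescribed.
@dataclass
class PlanRow:
    question: str                  # how much / how well / better off?
    component: str                 # program component to be measured
    indicators: list[str]
    data_sources: list[str]
    collection_methods: list[str]
    who_collects: str
    when_to_collect: str
    who_needs_findings: list[str]  # audiences for findings & lessons

row = PlanRow(
    question="Is anyone better off?",
    component="Outcomes for participants",
    indicators=["Rate of successful training & job placement completion"],
    data_sources=["Client records", "Mock interview feedback"],
    collection_methods=["Survey", "Administrative data review"],
    who_collects="Program coordinator",
    when_to_collect="At program exit & 6 months after",
    who_needs_findings=["Board", "Staff", "Funders"],
)

print(row.question, "->", row.component)
```

One such record per row of the template is enough to generate the full plan table, and keeping the three Friedman questions as the `question` field ties each row back to “How much / how well / better off?”.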
Thoughts, Reflections, Questions…
Building Evaluative Culture
1. How do you create a shared understanding of
   measurement and evaluation? How do you make
   measurement & evaluation everyone’s business?

2. What does measurement & evaluation mean to
   your board, staff, etc.?

3. How do you balance between learning and
   accountability? How do you manage the
   conversation about measurement & evaluation
   with funders/staff? How do you negotiate what
   can and cannot be delivered?
References
• Crawford, P. & Bryce, P. (2002). Project monitoring
  and evaluation: A method for enhancing the
  efficiency and effectiveness of aid project
  implementation.
• Friedman, M. (2005). Trying hard is not good
  enough: How to produce measurable
  improvements for customers and communities.
• Penna, R. & Phillips, W. (2004). Outcome
  frameworks: An overview for practitioners.
• Smith, M. (2010). Handbook of program evaluation
  for social work and health professionals.
• www.tccgrp.com
