Dr. Kathy Newcomer, Steve Mumford, 
Yvonne Watson, and Nick Hart 
George Washington University
Purpose: Share perspectives on thorny 
evaluation dilemmas and cultivate a community 
of practice 
Introduction to systems thinking (10 min) 
Small group dialogue by perspective (25 min) 
 Internal evaluator conducting evaluations (#1) 
 Internal evaluator overseeing external eval (#2) 
 External evaluator (#3) 
Each group discusses problem scenarios from past 
evaluation work 
Debrief on applying systems thinking (10 min)
Systems: mental constructs that help us make 
sense of the world and organize its complexity 
Systems thinking draws our attention to: 
 Interrelationships: how people, objects and 
processes are connected (patterns) and with what 
consequence within a particular context 
 Perspectives: how different people interpret a 
situation, judge success, and imagine alternatives 
 Boundaries: how we can make things manageable
Be reflective: observe yourself as others 
might, including your assumptions and values 
See the world from others’ perspectives 
Attend to emerging patterns 
Explore indirect influence 
Recognize your own involvement 
Act responsibly (ethically, with wise 
judgment) & take responsibility for actions
What are perspectives of different actors? 
What are important aspects of the context? Are 
additional details needed? 
What are the ethical, practical, methodological, professional, 
and other factors to consider? 
What are potential actions and their consequences? 
Which would you take and why? 
What assumptions and values apply? 
How might the scenario and action choice look different 
from a different perspective (e.g., internal vs. external 
evaluator)?
How did aspects of systems thinking show 
up in your group discussion? 
To what extent was systems thinking a 
helpful guide for thinking through scenarios? 
 What aspects were most or least useful? Why? 
 What other frameworks might add further value? 
Is dialogue about generic problem scenarios a 
useful method for generating diverse 
approaches to resolving thorny dilemmas?
Picture this: 
You produce an evaluation report for a small non-profit client that 
presents mixed and inconclusive results. 
While some areas of inquiry suggest possible positive outcomes, 
the data are not very compelling or high in quality, and are 
balanced by some negative outcomes in the same area. 
The client claims to understand the report and the reasons for 
your mixed conclusions. 
Then the following occurs: 
You attend a program-wide celebration hosted by the client. 
You notice a promotional handout summarizing evaluation 
findings, produced by the client without your knowledge. 
You realize that the handout includes only positive results and 
completely omits negative results and caveats. 
What do you do?
Picture this: 
You are conducting an evaluation for a client who leads a small, relatively 
“young” non-profit engaged in a new grant. 
Your evaluation contract and plan includes a focus group with program 
implementers about their experience with the program. 
This is your only opportunity to collect program implementation data to 
help interpret outcomes and suggest process improvements. 
Then the following occurs: 
A couple of weeks before the focus group, the client demands to videotape 
the meeting, select complimentary quotes, compile them into a 
promotional video, and post it on the non-profit’s website. 
You are concerned about confidentiality and about misconstruing participants’ 
comments, but the client gives an ultimatum that the first priority for the 
90-minute focus group meeting is to produce a testimonial video. 
The client controls scheduling and communication of the focus group 
and insists implementers are not available for additional meetings. 
What do you do?
Marty Monell (Pesticides) 
Barnes Johnson (RCRA) OR Caroline Hoskinson (OUST) 
Picture this: 
The executive director (ED) of a non-profit engages you to survey staff members about 
perceptions of the organizational culture and the benefits of more evaluation. 
When analyzing responses, you see that some staff members provided harsh and 
personal critiques of the ED’s leadership style in open-ended comments. 
One particularly critical staff member accuses him of racist and sexist behavior. 
The online survey is anonymous, but the agency has a very small staff. 
What do you do? 
Follow-up: 
You alert the ED to the general themes of the critical comments in a one-on-one 
meeting, and he insists on hearing more detail. 
He shares that one staff member who completed the survey recently resigned 
shortly after a negative annual performance review. 
The ED believes this person supplied the harshest criticism out of spite. 
What do you do now?
Picture this: 
Your office has been charged with collaborating with two other agencies on the 
design and implementation of an evaluation for a demonstration project. 
When the funding was provided, senior leaders agreed on the policy-relevant 
questions to address in the evaluation. 
After participant recruitment begins: 
You learn from one of the collaborating agencies that a critical data 
element has not been included in a survey to address a key policy-relevant 
question. 
The collaborating partner refuses to modify the survey to include the 
critical data element. 
What do you do?
Picture this: 
After collecting program data, you engage in a thoughtful analysis to draw up 
findings and recommendations. 
You send written findings and recommendations to your organization’s senior 
leaders in advance of an in-person meeting. 
During your meeting: 
The senior leaders express interest in your findings and appear supportive of your 
recommendations. 
An economist invited to the meeting speaks up to say your analysis is incomplete 
because you did not establish a causal relationship on which to base your 
recommendations. 
The senior leaders take note and immediately appear to become skeptical of your 
findings. 
What do you do?
Picture this: 
Tomorrow you are scheduled to meet with your organization's senior leaders to 
present your evaluation findings. 
Your written analysis and findings are being routed through an internal 
organizational clearance process in advance of the meeting. 
During the clearance process: 
Your immediate supervisor decides to edit your findings and change a key 
recommendation before submitting the materials to clearance, but does not 
discuss with you in advance. 
The changes to the findings are factually erroneous and the altered 
recommendation is not supported by the evaluation. 
After discovering the issue, you approach your supervisor to discuss the changes. 
Your supervisor insists the edits were correct and will not permit a retraction. 
What do you do?
Picture this: 
 The HQ office administers a national program implemented by the 
District office. 
 The District office initiates an evaluation to examine the efficiency and 
effectiveness of the program. 
During the course of the evaluation the following occurs: 
 The HQ office assigns a representative to the evaluation team who 
believes the District’s program should be shut down. 
 Rumors circulate that the HQ office plans to shut down the program 
while the evaluation is underway and 9 months from completion. 
 HQ personnel request a copy of the draft report to meet with 
members of Congress to discuss the program without involving any 
members from the District program or evaluation team. 
What’s an internal evaluator to do?
Picture this: 
 An office initiates an evaluation of an innovative program. 
 The contractor leading the evaluation is highly respected 
and has a solid track record of high-quality work. 
 The evaluation results will be used to determine if the 
approach should be scaled up. 
During the evaluation the following occurs: 
 The contractor begins to submit work that is of questionable 
quality. 
 The client office loses confidence in the contractor’s ability 
to complete the work and begins to scrutinize every 
deliverable. 
What’s an internal evaluator to do?
Picture this: 
 An office initiates an evaluation of a program. 
 Three program staff are assigned to the evaluation team. 
 The evaluation seeks to determine program effectiveness 
and if at all possible, impact. 
During the evaluation the following occurs: 
 The team discovers that no data exist to assess the impact questions. 
 One staff person insists on using more rigorous methods 
despite the evaluation team’s agreement to focus on 
outcomes, and insists on revisiting the issue at every 
meeting. 
 The contractor expresses concern about the resources 
required to respond to the staffer’s comments. 
What’s an evaluator to do?

Envisioning Solutions to Problem Scenarios in Evaluation (AEA 2014)


Editor's Notes

  • #2: Handouts of all slides (~40) and scenarios for each small group (~15). Steve starts off. Include brief bios of presenters here.
  • #3: Identify three group leaders (adjust group number and perspectives as necessary according to number of participants – may want to have them indicate perspectives by a show of hands to get a sense of group sizes). At the end, offer an opportunity to share contact info to keep in touch and continue dialogue on thorny evaluation problems.
  • #4: Systems are unique and individual ways of viewing the world, not necessarily reflective of any objective reality. Boundaries are necessarily arbitrary, and we need to think through the implications of the boundaries we choose. Focus mostly on adopting different perspectives in discussing problem scenarios, but also consider patterns and boundaries.
  • #5: Discuss bullet items in pairs of two, going down the list. First two relate to our outlook towards ourselves and others, second two to opportunities to exert influence (and avoid attempts at control, which might lead to defensive reactions), and third two to the actions we take (acknowledging that we are not simply neutral observers); all are interrelated
  • #6: Small group facilitators will pose short, real-life situations from practice, including key players, context & a dilemma/choice. Apply systems thinking to a dialogue of potential actions you did/might take, possibly considering some or all of the following questions. Start with one scenario and practice applying the different questions. You might need to supply some additional detail, but only if absolutely necessary (and keep it confidential). Then participants could offer their own scenarios; if no takers, move to #2. Probably time for about 3-4 total scenarios (~5 min each), depending on depth of discussion. Facilitators should monitor time and move on as needed, since scenarios are unlikely to reach a final consensus on an action to take. Kathy rotates around groups.
  • #7: Free-flowing large group discussion, facilitated by Kathy – add intro points of common themes from small group discussions as needed. Invite participants to share email info to keep in touch after session via email list
  • #8: Make handout with three scenarios