Survey Design Series

• 1. Developing Survey Goals
  - Smart Start funded programs use their Contract Activity Design (CAD); non-Smart Start funded programs use their agency goals aligned with the PFC goals.
  - PFC Goals: Goal 1 Health; Goal 2 Family Support; Goal 3 Early Care and Education; Goal 4 System Support.
  - PBIS (Outcomes): Early Intervention, Health Care Compliance, Child Care Placements, Staff Education, Staff Benefits, Stability, Parenting Skills, Family Literacy.
  - PSC (Activities/Strategies): Early Intervention/Spec. Ed, Family Support, Pre-K Classes, Parent Education, Literacy Programs, CCR&R, More at Four, Subsidy, Quality Enhancement/Maintenance Training, Prof. Development, Program Eval/Coordination, System Integration, Community Outreach, Family Resource Center.
  - Survey goals can: predict the needs of a client, community, or system; measure satisfaction with programs and services; determine change in community or client behaviors and perspectives; evaluate program effectiveness; assess program impact on a client, community, or system; gauge knowledge gained and implemented (see the sketch below).
  - These goals overlap; they are not mutually exclusive, and many can be examined within one survey.
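Purely as an illustration of how these overlapping goal types might be tracked when planning a survey, the sketch below tags a single hypothetical survey with more than one goal type. The `SurveyGoal` enum and the example survey are assumptions for illustration, not part of the PFC question bank.

```python
from enum import Enum, auto

class SurveyGoal(Enum):
    """Illustrative labels for the overlapping survey goal types listed above."""
    PREDICT_NEEDS = auto()            # predict needs of client, community, or system
    MEASURE_SATISFACTION = auto()     # satisfaction with programs and services
    DETERMINE_CHANGE = auto()         # change in behaviors or perspectives
    EVALUATE_EFFECTIVENESS = auto()   # program effectiveness
    ASSESS_IMPACT = auto()            # impact on client, community, or system
    GAUGE_KNOWLEDGE = auto()          # knowledge gained and implemented

# Because the goals overlap, a single survey can carry several goal types.
parent_education_survey = {
    "name": "Parent education follow-up (hypothetical)",
    "goals": {SurveyGoal.MEASURE_SATISFACTION, SurveyGoal.GAUGE_KNOWLEDGE},
}
print(parent_education_survey["goals"])
```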
• 2. Question Development
  - Determine survey goals: Smart Start Contract Activity Design (CAD) or agency/program goals.
  - Determine survey scope: inquiry, evaluation, or predictive survey.
  - Develop questions: questions and responses are standardized (see the sketch after this slide).
  - Closed-ended (quantitative) questions: responses are the researcher's 'acceptable' choices; responses are mutually exclusive; used to measure preferences or behaviors and knowledge (pre- or post-).
  - Open-ended (qualitative) questions: narrative form replaces vague responses; used when the range of responses exceeds an appropriate question length; can make responses socially acceptable.
  - Standardization means that questions and answers are consistent and understandable, questions are administered to respondents in the same manner each time, respondents have access to the information needed to answer, and respondents are willing to provide answers.
  - Five challenges to writing good questions: defining objectives and specifying answers that meet those objectives; ensuring respondents share a common understanding of what a question means; asking respondents questions they can answer; asking questions respondents can answer in the terms the question requires; asking questions in a way respondents are willing to answer.
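A minimal sketch of what "questions and responses are standardized" can look like in practice, assuming a hypothetical question record with pre-coded, mutually exclusive response choices. The field names and the example item are illustrative, not drawn from the question bank.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ClosedQuestion:
    """A standardized closed-ended item: fixed wording and pre-coded choices."""
    code: str                  # stable identifier, e.g. a question-bank key
    text: str                  # exact wording administered the same way each time
    responses: dict            # pre-coded, mutually exclusive choices {code: label}

    def validate(self, answer: int) -> bool:
        """An answer is acceptable only if it is one of the pre-coded choices."""
        return answer in self.responses

# Hypothetical example item
satisfaction_item = ClosedQuestion(
    code="FS-07",
    text="How satisfied were you with the parenting class you attended?",
    responses={1: "Very satisfied", 2: "Satisfied", 3: "Dissatisfied", 4: "Very dissatisfied"},
)

assert satisfaction_item.validate(2)       # a pre-coded choice
assert not satisfaction_item.validate(9)   # outside the coded range
```

Keeping the wording and codes fixed is what allows responses to be compared across administrations and pooled in later analysis.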
• 3. Determining Survey Populations
  - Subsets of children: children 0-5 years old and not in kindergarten; children 5 years and older and in kindergarten (see the sketch after this slide).
  - Child-focused topic areas: children and health; children and early care and education; children and families; children and system support.
  - Subsets of the community (with or without children): Cumberland County (all persons); households with children; households without children; persons interested in the well-being of children, families, schools, and the community.
  - Legislators/policy makers, school personnel (K-12), direct service providers, and community leaders/stakeholders.
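A hedged sketch of how these population subsets might be operationalized when screening respondents for a sample. The function names and inputs (age, kindergarten enrollment, household composition) are assumptions for illustration only.

```python
def assign_child_cohort(age_years: int, in_kindergarten: bool) -> str:
    """Split child respondents into the two cohorts described above."""
    if not in_kindergarten and age_years <= 5:
        return "0-5 years old, not in kindergarten"
    if in_kindergarten and age_years >= 5:
        return "5 years and older, in kindergarten"
    return "outside the two child cohorts"  # e.g. an older child not enrolled

def assign_household_subset(has_children: bool) -> str:
    """Split community respondents by household composition."""
    return "household with children" if has_children else "household without children"

print(assign_child_cohort(3, False))   # 0-5 years old, not in kindergarten
print(assign_household_subset(True))   # household with children
```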
• 4. Survey Question Bank: Design, Analysis, Evaluation
  - A centralized database of survey questions aligned with the Performance Based Incentive System (PBIS) and Purpose Service Codes (PSC).
  - Purpose: aid survey design; provide questions that are evaluated and pre-tested; streamline data collection; track service-area progress through outputs; let Smart Start and local agencies evaluate similar areas.
  - Design: pre-coded responses; continuity in question content; continuity in question format; organization by topic area.
  - Evaluation: a 4-stage evaluation process; a 1-year trial; grouping by PFC goal, PBIS, and PSC.
  - Analysis: by time and by individual question; linked by age cohort and service areas.
  - Construct validity: correlation among several questions asked in a similar way. Predictive validity: measures that predict answers to other related questions. Discriminant validity: the extent of differentiation in expected responses. (See the sketch after this slide.)
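The validity checks can be illustrated with a short calculation. Assuming made-up pre-coded responses (1-4) from five respondents to three similarly worded items, the sketch below computes the pairwise correlations a construct-validity check would examine; high correlations among similar items are the expected pattern.

```python
import numpy as np

# Hypothetical pre-coded responses (1-4) from five respondents to three
# similarly worded satisfaction items drawn from a question bank.
items = np.array([
    [4, 4, 3],
    [3, 3, 3],
    [2, 2, 1],
    [4, 3, 4],
    [1, 2, 1],
])

# Pairwise Pearson correlations between items (columns are variables):
# construct validity expects several questions asked in a similar way
# to correlate strongly with one another.
corr = np.corrcoef(items, rowvar=False)
print(np.round(corr, 2))
```

A discriminant-validity check would run the same calculation against items measuring an unrelated construct and expect those correlations to be low.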
• 5. Goal: strengthen program and community data collection through integrative survey development processes.
  - Objective 1: data is collected from a variety of community organizations. Customers/clients are Smart Start grantees, are non-Smart Start funded, or serve children and families in K-12; they identify their programs' primary focus as health, early care and education, family support, or system support.
  - Objective 2: a centralized database of questions and templates for survey design. Surveys are designed and analyzed by R&D; questions are created annually to support the Smart Start Purpose Service Codes (PSC) and the Smart Start Performance Based Incentive System (PBIS); the question bank is kept current in outputs and research.
  - Objective 3: customers/clients use surveys to collect program data. Customers/clients report selecting questions from the survey database or creating their own surveys; increased understanding of survey design and evaluation; satisfaction or strong satisfaction with survey design and evaluation technical assistance; learning one new tool or skill, implementing it in their programs, and referring this resource to others.
  PFC assumptions:
  - Aligning Smart Start goals and outcomes with survey questions supports targeted evaluations.
  - Pre-determined questions assist providers in coordinating survey goals with desired outcomes.
  - Providing technical assistance in survey design and analysis increases reliable data collection.
  - Data collection from like-mission agencies reinforces tracking systems with similar outcomes.
  - Demystifying survey design and analysis increases the use of surveys in program evaluation.