Seeking Evidence of Impact: Answering "How Do We Know?"
Veronica Diaz, PhD, Associate Director, EDUCAUSE Learning Initiative, EDUCAUSE
League for Innovation Learning College Summit, Phoenix, AZ
Today’s Talk
- Review what it means to seek evidence of impact of teaching and learning innovations
- Consider strategies for using evaluation tools effectively
- Determine ways to use evidence to influence teaching practices
- Review ways to report results
Who are we?
- Academic instruction
- Faculty development
- Instructional technology
- Instructional design
- Library
- Information technology
- Senior administration
- Other
Why are we here?
- I am working on evaluating T&L innovations now.
- Evaluation of T&L is part of my formal job description.
- My campus unit's director or VP mandates our gathering evidence of impact.
- A senior member of the administration (dean, president, senior vice president) mandates gathering evidence of impact in T&L.
- I am working as part of a team to gather evidence.
- Accreditation processes are prompting us to begin measurement work.
Why evidence of impact?
Seeking Evidence of Impact: Answering "How Do We Know?"
What the community said
Download the survey: https://docs.google.com/document/d/1Yj37DINUdCyk5DXPelr0FJ1Rewfx8xON-cLo-NJ5e4U/edit?hl=en_US#
Technologies to Measure
- Web conferencing
- LMS and individual features
- Lecture capture
- Mobile learning tools (laptops, ebooks, tablets)
- Clickers
- Collaborative tools
- Student-generated content
- Web 2.0 and social networking technologies
- Learning spaces
- OER
- Personal learning environments
- Online learning: hyflex course design, blended learning programs, synchronous/asynchronous delivery modes, fully online programs
- Eportfolios
- Multimedia projects and tools: podcasts/vodcasts
- Simulations
- Early alert systems
- Cross-curricular information literacy programs
- Large course redesigns
Technologies and their connection/relationship to…
- Student engagement
- Learning-related interactions
- Shrinking the large class
- Improving student-to-faculty interaction
- Student retention and success
- Specific learning outcomes
(A minimal illustrative sketch follows.)
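Where one of these relationships is examined quantitatively, even a simple correlation can serve as a first pass. Below is a minimal sketch, assuming a hypothetical per-student export ("course_data.csv") with illustrative column names; it is an illustration, not a method prescribed by the session.

```python
# Minimal sketch: relating a technology-usage metric to an outcome.
# Assumes a hypothetical export "course_data.csv" with per-student columns
# lms_logins (LMS activity proxy) and final_grade (0-100); names are illustrative.
import pandas as pd

df = pd.read_csv("course_data.csv")

# Pearson correlation between the engagement proxy and the course outcome.
r = df["lms_logins"].corr(df["final_grade"])
print(f"Correlation between LMS activity and final grade: r = {r:.2f}")
```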
The 1-2-3 most important indicators you use to measure the evidence of impact of technology-based innovations in T&L
What is “evidence”?
- Grades (frequently mentioned)
- Learning outcomes (frequently mentioned)
- Satisfaction
- Skills
- Improved course evaluations
- Measures of engagement and participation
- Retention/enrollment rates
- Graduation rate
- Direct measures of student performance (at the course level and cumulative)
- Interview data
- Institutional data
- Faculty/student technology utilization rates
- Data on student/faculty facility and satisfaction with using technology
- Successful technology implementation
- Job placement
- Student artifacts
- Better faculty reviews by students
- Course redesign to integrate changes; impact on the ability to implement best pedagogical practice
- Rates of admission to graduate schools
- Success in more advanced courses
Methods/techniques you ROUTINELY USE for gathering evidence of impact of technology-based innovations in T&L
The most difficult tasks associated with measurement, ranked:
1. Knowing where to begin to measure the impact of technology-based innovations in T&L
2. Knowing which measurement and evaluation techniques are most appropriate
3. Conducting the process of gathering evidence
4. Knowing the most effective way to analyze our evidence
5. Communicating to stakeholders the results of our analysis
I have worked with evaluative data: Yes / No
I have worked with evaluative data at the…
- Course level (in my own course)
- Course level (across several course sections)
- Program level (math, English)
- Degree level
- Across an institution or several programs
- Other
Using evaluation tools effectively
Remember: technologies and their connection/relationship to…
- Student engagement
- Learning-related interactions
- Shrinking the large class
- Improving student-to-faculty interaction
- Student retention and success
- Specific learning outcomes
Triangulate to tell the full story. The impact of a curricular innovation should be “visible” from a variety of perspectives and measurement techniques. The three most commonly used evaluation tools are questionnaires (paper or online), interviews (individual or focus group), and observations (classroom or online).
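A minimal sketch of what triangulation can look like in practice: lining up the three common evidence sources on a shared key, so that a finding can be checked from multiple perspectives at once. All file and column names here are hypothetical placeholders.

```python
# Minimal triangulation sketch: align three evidence sources on one key.
# File names and columns are hypothetical placeholders.
import pandas as pd

surveys = pd.read_csv("survey_means.csv")        # columns: section, satisfaction_mean
interviews = pd.read_csv("interview_themes.csv") # columns: section, engagement_mentions
observations = pd.read_csv("observations.csv")   # columns: section, interaction_rating

combined = (surveys
            .merge(interviews, on="section")
            .merge(observations, on="section"))

# A finding is more credible when all three perspectives point the same way.
print(combined.sort_values("satisfaction_mean", ascending=False))
```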
5 Steps
1. Establish the goals of the evaluation: What do you want to learn?
2. Determine your sample: Whom will you ask?
3. Choose your methodology: How will you ask?
4. Create your instrument: What will you ask?
5. Pre-test the instrument: Are you getting what you need? (Pilot your tools/strategies.)
What is a good question?
- Significance: it addresses a question or issue that is seen as important and relevant to the community
- Specificity: the question focuses on specific objectives
- Answerability: the question can be answered by data collection and analysis
- Connectedness: it is linked to relevant research/theory
- Coherency: it provides coherent explanations that rule out counter-interpretations
- Objectivity: the question is free of bias
Whom does your evidence need to persuade?
Quantitative. This approach starts with a hypothesis (or theory or strong idea) and seeks to confirm it.
Qualitative. These studies start with data and look to discover the strong idea or hypothesis through data analysis.
Mixed. This approach combines the two, pairing confirmation of a hypothesis with exploratory data analysis, and provides multiple perspectives on complex topics. Example: start with a qualitative study to gather data and identify the hypothesis, then follow with a quantitative study to confirm it.
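For the quantitative (confirmatory) step, a minimal sketch might compare outcomes between a redesigned section and a comparison section. The scores below are invented for illustration only.

```python
# Minimal quantitative sketch: test the hypothesis that a redesigned
# section outperforms a traditional one. All data here are made up.
from scipy import stats

redesigned = [78, 85, 92, 74, 88, 81, 90]   # final scores, redesigned section
traditional = [70, 76, 83, 68, 79, 75, 72]  # final scores, comparison section

# Welch's t-test: does not assume equal variances across sections.
t, p = stats.ttest_ind(redesigned, traditional, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")  # a small p-value would support the hypothesis
```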
The Double Loop
Methods? Support in data collection? Double loop?
Using evidence to influence teaching practices
“higher education institutions seem to have a good understanding of the assessment process through the use of rubrics, e-portfolios, and other mechanisms, but the difficulty seems to be in improving the yield of the assessment processes, which is more of a political or institutional culture issue”
Why We Measure
- Inward (course level, inform teaching, evaluate technology use, reflective)
- Outward:
  - Share results with students
  - Share results with potential students
  - Share results with other faculty (in/out of discipline)
  - Share results at the institutional or departmental level (info literacy, writing, cross-course projects)
- Results can be a strategic advantage
Lessons from the Wabash National Study
- A three-year research and assessment project
- Provides participating institutions extensive evidence about the teaching practices, student experiences, and institutional conditions that promote student growth across multiple outcomes
- Inputs: the attitudes and values that students bring into college
- Experiences: the experiences that shape students once they are in college
- Outcomes: the impact that college has on student ability and knowledge
http://www.liberalarts.wabash.edu/wabash-study-2010-overview/
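One way to operationalize the inputs-experiences-outcomes framing is a regression that estimates the effect of an experience measure while holding an input measure constant, so the experience effect is not just incoming differences between students. The sketch below uses invented data and illustrative variable names; it is not the Wabash study's actual methodology.

```python
# Minimal sketch of the inputs-experiences-outcomes framing:
# regress an outcome on an experience measure, controlling for an input.
# All values and variable names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "entering_score": [48, 55, 61, 50, 67, 58, 72, 60],          # input
    "engagement":     [3.1, 2.4, 4.0, 3.5, 4.2, 2.8, 4.5, 3.0],  # experience
    "outcome_score":  [52, 54, 70, 58, 74, 57, 80, 60],          # outcome
})

model = smf.ols("outcome_score ~ entering_score + engagement", data=df).fit()
# The engagement coefficient estimates the experience effect with inputs held constant.
print(model.params)
```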
Measuring student learning and experience is the easiest step in the assessment process. The real challenge begins once faculty, staff, administrators, and students at institutions try to use the evidence to improve student learning.
www.learningoutcomesassessment.org/documents/Wabash_000.pdf
Lessons from the Wabash National Study
Faulty assumptions about using evidence to improve:
- A lack of high-quality data is the primary obstacle to using assessment evidence to promote improvements
- Providing detailed reports of findings is the key mechanism for kicking off a sequence of events culminating in evidence-based improvements
- The intellectual approach that faculty and staff use in their scholarship facilitates assessment projects

Lessons from the Wabash National Study
- Perform audits of your institution's information about student learning and experience
- Set aside resources for faculty, student, and staff responses before assessment evidence is shared
- Develop communication plans to engage a range of campus representatives in data discussions
- Use conversations to identify 1-2 outcomes on which to focus improvement efforts
- Engage students in helping to make sense of and form responses to assessment evidence
Download the rubrics: http://www.aacu.org/value/rubrics/
Research Grants: 2010 & 2011
http://www.qmprogram.org/research
Seeking Evidence of Impact: Answering "How Do We Know?"
Organizational-Level Data: Learner Satisfaction and Student Learning
Learner satisfaction:
- Student satisfaction higher in QM-reviewed courses & non-reviewed courses than in courses at non-QM institutions (Aman dissertation, Oregon State, 2009)
- Course evaluation data showed student satisfaction increased in redesigned courses (Prince George’s Community College, MD, 2005)
- Currently conducting a mixed-methods study of student & faculty perceptions of QM-reviewed courses (University of the Rockies)
Student learning:
- Grades improved with improvements in learner-content interaction (result of review) (Community College of Southern Maryland, 2005)
- Differences approaching significance on outcome measures (Swan, Matthews, Bogle, Boles, & Day, University of Illinois/Springfield, 2010+)
- QM Rubric implementation had a positive effect on students' higher-order cognitive presence & discussion forum grades via higher teaching presence (Hall, Delgado Community College, LA, 2010)
Organizational-Level Data: Teacher Learning and Organizational Learning
Teacher learning:
- Use of QM design standards led to “development of a quality product, as defined by faculty, course designers, administrators, and students, primarily through faculty professional development and exposure to instructional design principles” (p. 214) (Greenberg dissertation, Ohio State, 2010)
- Currently utilizing the TPACK framework to explain the process by which new online teachers use the QM rubric and process when designing an online course (University of Akron)
Organizational learning:
- There may be a carryover effect to non-reviewed courses when an institution commits to the QM standards (Aman dissertation, Oregon State, 2009)
- Faculty/design teams respond differently when QM is presented as a rule rather than a guideline (Greenberg dissertation, Ohio State, 2010)
- Extended positive impact on faculty developers & on members of review teams (preliminary analysis 2009; comprehensive summer 2011)
Alignment in the curriculum between course objectives, goals, and assessments
- Faculty members identify which assignments they have aligned with learning objectives
- Design rubrics or instructions to prompt them at various data-collection points
- Departments or colleges are asked to report data online at the end of each term, with prompts for comparison and reflection
- Doing so makes the data ready for larger-scale assessment efforts (see the sketch below)
Session recording and resources: http://net.educause.edu/Program/1027812?PRODUCT_CODE=ELI113/GS12
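As referenced above, here is a minimal sketch of rolling end-of-term alignment reports up for larger-scale assessment. The submission file and its columns are hypothetical assumptions, not an actual institutional format.

```python
# Minimal end-of-term reporting sketch: roll up per-assignment alignment
# data submitted by departments. File name and columns are hypothetical.
import pandas as pd

df = pd.read_csv("alignment_reports.csv")
# Expected columns: department, objective, n_aligned_assignments, mean_score

summary = (df.groupby(["department", "objective"])
             .agg(assignments=("n_aligned_assignments", "sum"),
                  mean_score=("mean_score", "mean"))
             .reset_index())

# A term-over-term file like this is ready for larger-scale assessment work.
summary.to_csv("term_alignment_summary.csv", index=False)
```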
What organizational mechanisms do you have in place to measure outcomes?
Reporting results
Match your research design to the type of information your anticipated consumers are interested in or will best respond to. Match your data-collection method to the type of data your information consumers will respond to or are likely to respect.
Keep it simple, to the point, and brief. Know who is consuming your data or research report, who the decision makers are, and how your data will be used to make which decisions, if any. Although time-consuming, it may be worthwhile to tailor your reports or analysis to the audience, emphasizing certain findings or providing deeper analysis of the sections of greatest interest.
Good research: Tips and tricks
- Be careful of collecting too much data; be aware of reaching the point at which you are no longer learning anything new from it
- Write up and analyze your data as soon as possible
- Record interviews and focus groups, and capture your own observations or impressions immediately following each interaction
- Besides all the usual good reasons for not reinventing the wheel, using others' tested surveys, tools, or methods gives you a point of comparison for your own data: http://www.educause.edu/Resources/ECARStudyofUndergraduateStuden/217333
- When collecting data, talk to the right people
- Don't overschedule: space out interviews, focus sessions, observations, and other tactics so that you can get the most from each interaction
Guiding Questions or Next Steps
- Who are the key stakeholders for the innovative teaching and learning projects in which I am involved?
- How can I help faculty members communicate the results of their instructional innovations to (a) students, (b) administrators, and (c) their professional communities?
- What "evidence" indicators do my key stakeholders value most (e.g., grades, satisfaction, retention)?
- Which research professionals or institutional research units can assist me in my data collection, analysis, and reporting efforts?
Collecting Cases
Project Overview
- Project goals, context, and design
- Data collection methods
- Data analysis methods
- Findings
- Communication of results
- Influence on campus practices
Reflection on Design, Methodology, and Effectiveness
- Project setup and design
- Project data collection and analysis
- Effectiveness and influence on campus practices
Project Contacts
Supporting Materials
Online Spring Focus Session, April 2011: http://net.educause.edu/eli113
Read about the initiative: http://www.educause.edu/ELI/SEI
Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626
Join the ELI Evidence of Impact Constituent Group: http://www.educause.edu/cg/EVIDENCE-IMPACT
SEI Focus Session Content
These items for the 2011 Online Spring Focus Session on seeking evidence of impact can be found at http://net.educause.edu/eli113.
- ELI Seeking Evidence of Impact Resource List (websites, reports, articles, and research): http://net.educause.edu/section_params/conf/ELI113/SFSResourceListAllFINAL.pdf
- ELI Seeking Evidence of Impact Discussion Questions: http://net.educause.edu/section_params/conf/ELI113/discussion_prompts_team-indiv2011.doc
- ELI Seeking Evidence of Impact Activity Workbook, Days 1 and 2: http://net.educause.edu/section_params/conf/ELI113/activity_prompts_team-indiv2011.doc
- ELI Seeking Evidence of Impact Reflection Worksheet: http://net.educause.edu/section_params/conf/eli103/reflection_worksheet.doc
Presentation slides and resources for all sessions: http://net.educause.edu/eli113/2011ELIOnlineSpringFocusSessionRecordings/1028384

Other Related Resources
- Focus Session Learning Commons: http://elifocus.ning.com/
- Full focus session online program: http://net.educause.edu/Program/1027810
- ELI Seeking Evidence of Impact initiative site: http://www.educause.edu/ELI/SEI
- Resource site: http://www.educause.edu/ELI/SeekingEvidenceofImpact/Resources/206625
- Suggest an additional resource: http://tinyurl.com/resourceidea
- Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626
- Contribute: http://tinyurl.com/elisei
Contact Information
Veronica M. Diaz, PhD
Associate Director, EDUCAUSE Learning Initiative
vdiaz@educause.edu

Copyright Veronica Diaz, 2011. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.