Assessing program-level learning outcomes: theory and practice Presentation for Los Angeles Community College District October 16, 2009 Arend Flick Assoc. Professor, English
A reminder: why we’re here . . . “It is extremely difficult to argue as a responsible academic that it is wrong to gather information about a phenomenon, or that it is inappropriate to use the results for collective betterment.” (Peter Ewell) “Seasoned observers have pointed out the irony of the academy, as an institution dedicated to discerning the truth through evidence, being so seemingly resistant to measuring quality through evidence.” (American Association of State Colleges and Universities)
What you should know and be able to do after this session . . . You should be able to write learning outcomes for a program.  You should become more familiar with methods (direct and indirect) available to you to assess learning in a particular program. You should become more familiar with how you might make assessment of programs a systematic part of your college’s processes.  You will have produced draft program-level outcomes statements and draft assessment plans for modification and (eventually) adoption by your colleges.
Why Assess Programs? To demonstrate that they work (i.e., for accountability purposes) To help in planning, resource allocation, etc. To identify problem areas (e.g., misalignment of courses with program goals) that can lead to improvement
What  Is  a Program? Title V (section 55000) says it’s “an organized sequence of courses leading to a defined objective, a degree, a certificate, a diploma, a license, or transfer to another institution of higher education.” Title V also stipulates that 18 semester hours are required at a minimum.
Which means . . . LACCD has literally hundreds of programs to assess (partly because you have so many majors). The largest program at any of your colleges is general education. But you also have other interdisciplinary programs (e.g., honors, Puente, basic skills, study abroad, IGETC, etc.). Your career-tech programs are probably already far along in the process of assessing learning (even if they don’t know it).
And Means . . . We need to find ways of identifying and assessing program-level SLOs that are not unduly burdensome (or intrusive), lead to improvement in teaching and learning, satisfy increasingly rigorous accountability demands, assist in planning processes, AND meet ACCJC standards.
Why program-level assessment is the greatest challenge for CCs So many programs (and many “phantom” programs) So many interdisciplinary programs, often with no one directly responsible for ensuring their effectiveness Assessment methods that work well for university programs often don’t work as well for us
Assessing Program-Level Outcomes Define the program’s learning outcomes (what we want students to be able to think, do, or know when they’ve completed it). Check for alignment between curriculum and outcomes. Develop an assessment plan. Collect assessment data. Use this information for improvement, for planning, for accountability, for resource allocation. Routinely examine the assessment process itself. adapted in part from Mary J. Allen's work at CSU Bakersfield
What Are SLOs?  (a Refresher) They can be defined at any instructional level, ranging from the specific lesson all the way up to the institution. As opposed to “objectives,” they emphasize  application  of knowledge (what a student can do at the end of a course of instruction). They are not discrete or highly specific skills but aggregated complexes of knowledge, ability, and attitudes.  They represent the broadest goals of the course or program.
Program-Level SLO Suggestions The fewer the better. Don’t get too hung up on language--this is only the first (and by far the simplest) part of the assessment cycle, and you can change/refine SLOs later as necessary. Make sure the outcome is something that can be assessed. Work collaboratively as much as possible. Solicit advice from advisory groups, licensing/accrediting boards, etc.—and do a Google or ERIC search of counterpart program SLOs at other community colleges. Think about course-program alignment, but recognize that some course SLOs won’t map to program SLOs (and some program SLOs will depend on OLEs for achievement).
Some Examples of Programmatic SLOs Computer/Business Applications Certificate: Productively work as a team member with people with diverse backgrounds in a workplace environment. Communicate effectively in support of a business office, including production and design of complex electronic and paper-based correspondence and documents. Use the Internet, a wide variety of computer applications, and standard business procedures to compute, analyze business performance, and solve problems. Actively assist in implementing general office procedures, including records management. Demonstrate a high degree of self-management and self-awareness in terms of workplace responsibility and productivity.
Music Demonstrate understanding of the fundamental melodic, harmonic, and rhythmic structure of music. Demonstrate fluency with the language of music in written and aural form. Perform on an instrument (or voice) at college sophomore level. Perform effectively in a musical ensemble. Use the piano keyboard to demonstrate and apply musical concepts. Demonstrate understanding of the historical development of music.
English 80% of a sample of graduating English majors in a literature survey course will be able to score at least 70% on a test designed to measure their success in identifying authors, in placing them in their historical periods, and in knowing the titles of their major works. (an operational SLO) At graduation, students are able to write a clear, coherent, and persuasive essay demonstrating their ability to analyze and interpret texts, to apply secondary criticism to them, and to explain their contexts.
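To make the operational SLO above concrete, here is a minimal sketch in Python of how a department might check whether the 80%-at-70% criterion has been met. The scores are hypothetical; a real assessment would draw on an actual sample of graduating majors.

```python
# Minimal sketch: checking the operational English SLO above against a
# hypothetical sample of test scores (percent correct) from graduating majors.

scores = [82, 74, 65, 91, 70, 88, 58, 77, 73, 95]  # hypothetical data

passing = [s for s in scores if s >= 70]   # students scoring at least 70%
pass_rate = len(passing) / len(scores)     # proportion meeting the cutoff

print(f"Pass rate: {pass_rate:.0%}")       # here: "Pass rate: 80%"
print("Criterion met" if pass_rate >= 0.80 else "Criterion not met")
```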
Some Practical Advice in Writing Programmatic SLOs Identify the most important things a student should leave your program being able TO DO (or know). Address student competency rather than content coverage. (Try for no more than five SLOs.) Consider: course SLOs and major assignments; transfer alignment needs; external licensure/accreditation requirements; employer needs; alumni feedback. Use active verbs to craft sentences that are clear and intelligible to students. Ensure that the SLO is assessable, measurable. Share draft SLOs with colleagues to sharpen focus.
Some hands-on practice Using the worksheet, let’s spend 20 minutes or so trying to develop a short (but comprehensive) list of outcomes for a degree or certificate in our discipline. In the last five minutes, let’s trade with someone outside the discipline for feedback.
Program SLO Checklist Outcomes are written using action verbs. The language indicates the program’s big picture, not nuts & bolts. Outcomes describe what students can DO, not what the program’s goals are.  They address student competency (how they apply what they’ve learned) rather than content coverage.
Once We Have Programmatic SLOs . . . Where should they appear? In the college catalog? On the college website (and any program-focused web pages)? On brochures, posters, etc. that describe our programs? Other?
Some direct (i.e., performance-based) methods to assess learning in programs Look at work that students are already doing in courses to determine if, and to what extent, it demonstrates their achievement of program-level competencies. Administer nationally normed (or locally developed) tests of program-level competencies. Have students reflect on their values, attitudes, and beliefs (if developing these is an intended outcome of the program). In C-T programs, have employers rate skills of recent graduates. In C-T programs, use scores and pass rates on appropriate licensure exams that can be aligned with program SLOs.
Using student work produced in classes to assess programs In capstone courses (or de facto capstone courses) Through portfolios, either paper or electronic (i.e., by taking a second look at essays, projects, presentations, etc. that students are already doing in courses) Through individual faculty assessments at the course level, either informally or through an assessment data management system As a beginning, through a course-program assessment matrix
A course-program assessment matrix If you’ve defined (and are assessing) course-level SLOs, you can use a matrix to gather information about course-level learning that potentially maps to program-level competencies. Two caveats: A student may not be in a specific program just because she is taking a class required by that program. (Most “GE” courses serve three or more “programs” simultaneously.) Students may achieve some program-level outcomes through co-curricular activities, not coursework.
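As an illustration of what such a matrix might look like, the sketch below represents it as a simple mapping from courses to program-level outcomes, marked with one common convention (Introduced/Developed/Mastered). The course numbers, outcome phrasing, and ratings are all hypothetical, loosely echoing the English example earlier.

```python
# A minimal, hypothetical course-program assessment matrix.
# Rows are courses; columns are program-level SLOs; cells mark where each
# outcome is Introduced (I), Developed (D), or Mastered (M), or left blank.

program_slos = ["Analyze texts", "Apply secondary criticism", "Explain contexts"]

matrix = {
    "ENGL 101": {"Analyze texts": "I", "Apply secondary criticism": "",  "Explain contexts": ""},
    "ENGL 205": {"Analyze texts": "D", "Apply secondary criticism": "I", "Explain contexts": "D"},
    "ENGL 250": {"Analyze texts": "M", "Apply secondary criticism": "D", "Explain contexts": "M"},
}

# Flag program SLOs that no course develops to mastery -- a likely alignment gap.
for slo in program_slos:
    levels = [row.get(slo, "") for row in matrix.values()]
    if "M" not in levels:
        print(f"Possible gap: no course brings '{slo}' to mastery")
```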
Some problems with direct methods to assess programs They are often labor-intensive. They often don’t allow for cross-institutional comparison. They too often tacitly depend on the judgments of individual instructors, working in isolation from each other, about their own students. In the case of externally designed standardized tests, instructors may mistrust results.
Some indirect methods to assess learning in programs Student surveys of self-perceived learning gains Student engagement surveys (e.g., the CCSSE) Alumni surveys Focus groups (of exiting students, alumni, etc.) Faculty surveys For C-T courses, surveys of or interviews with employers For C-T courses, employment data IR data on student retention and success in programs (though these can be deceptive indicators of “student learning.”)
A note on student self-reports They are likely to be moderately valid measures of student learning when: The information requested is known to students. Questions are phrased clearly and unambiguously. Questions refer to recent activities. Students think questions merit a serious and thoughtful response. Questions don’t threaten or embarrass the student. G.D. Kuh, "Using student and alumni surveys for accountability in higher education" (2005)
Developing an assessment plan Some models: Riverside Community College District Cabrillo College Community College of Baltimore County University of South Alabama (using TracDat)
Some features of good systematic program-level assessment Units are expected to undertake program-level assessment (PLA) cycles regularly and report results annually. (Reports include all Nichols-column information: 1) SLO(s) assessed, 2) assessment method(s) employed, 3) brief description of data generated by assessment, 4) how results were used for improvement.) Assessment reports are read and used as part of strategic planning processes. Resource allocation decisions take assessment results into consideration (e.g., requests for new technology are fulfilled on the basis of how the technology has been shown to improve learning). It is a component of the program review process, which itself is a component of planning processes.
Reporting assessment results
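One way to picture the standardized annual report described on the previous slide is as a record holding the four Nichols-style columns. The sketch below is only illustrative; the field names and sample values are hypothetical, loosely echoing the certificate example from an earlier slide.

```python
# Minimal sketch of an annual program-assessment report record, holding the
# four Nichols-style columns named on the previous slide. Values are hypothetical.

from dataclasses import dataclass

@dataclass
class AssessmentReport:
    program: str
    slo_assessed: str     # 1) SLO(s) assessed
    method: str           # 2) assessment method(s) employed
    findings: str         # 3) brief description of the data generated
    use_of_results: str   # 4) how results were used for improvement

report = AssessmentReport(
    program="Computer/Business Applications Certificate",
    slo_assessed="Communicate effectively in support of a business office.",
    method="Rubric scoring of a capstone correspondence portfolio.",
    findings="72% of sampled students scored 'proficient' or above.",
    use_of_results="Added a document-design module to the second-semester course.",
)
print(report)
```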
Some final thoughts on making this work for us . . . Annual assessment updates should be short, in a standardized format, and preferably organized into a searchable database. A routine expectation of administrators and faculty is that they participate regularly in the assessment process—with faculty preferably working collaboratively to define and assess outcomes and to use results for improvement. Hold off (for now) on interdisciplinary program assessment, but eventually assign responsibility (maybe to “coordinating disciplines”). New programs should have SLOs and assessment plans as a condition of approval. Planners need to use assessment results and program reviews to define what the institution does well (and not well). Their decisions need to be driven by empirical evidence related to student learning.
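As one possible way to make those short annual updates searchable, a single small database table holding the same four Nichols-style fields is often enough. The schema and sample row below are only an assumption for illustration, not an existing district system.

```python
# Illustrative sketch only: storing annual assessment updates in a small,
# searchable SQLite database. Table and column names are hypothetical.

import sqlite3

conn = sqlite3.connect("assessment_updates.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS updates (
        id              INTEGER PRIMARY KEY,
        year            INTEGER,
        program         TEXT,
        slo_assessed    TEXT,
        method          TEXT,
        findings        TEXT,
        use_of_results  TEXT
    )
""")

conn.execute(
    "INSERT INTO updates (year, program, slo_assessed, method, findings, use_of_results) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (2009, "Music", "Perform effectively in a musical ensemble",
     "Juried ensemble performance rubric", "85% rated satisfactory or better",
     "Added a weekly sectional rehearsal"),
)
conn.commit()

# A planner can then search across programs and years, e.g. all updates touching 'ensemble'.
for row in conn.execute(
    "SELECT year, program, findings FROM updates WHERE slo_assessed LIKE ?", ("%ensemble%",)
):
    print(row)
```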