DESIGNING RUBRICS FOR COMPETENCY-BASED EDUCATION
CENTER FOR ONLINE INNOVATION IN LEARNING (COIL)
COMPETENCY-BASED EDUCATION WORKSHOP (10/29/2015)
KYLE PECK
PROFESSOR OF EDUCATION
PENN STATE UNIVERSITY
“WHAT WILL PEOPLE NEED FROM US?”
• Access to high-quality content will increasingly be free.
• MOOCs and other forms of peer- and machine-evaluated learning experiences will improve dramatically.
• Our primary service will be issuing high-quality credentials based on high-quality assessments of higher-order capabilities.
• (And perhaps creating and sustaining learning communities, but that’s another discussion.)
THE POWER OF “FEEDBACK”
(Diagram: Performance → Assessment → Feedback → Focused Effort → Improvement → Confidence → Competency)
THE RUBRIC: THE PREFERRED WAY TO ASSESS HIGHER-ORDER LEARNING
• Easy to use and to explain
• Make expectations very clear to learners
• Provide students with more and better feedback about their strengths and areas in need of improvement
• Support learning and the development of skills
• Support good thinking.
• Based on the “Rubrics and Bloom’s Taxonomy” Wiki at https://hillerspires.wikispaces.com/, licensed under a Creative Commons Attribution Share-Alike 3.0 License.
WHAT IS A RUBRIC?
• A rubric is a scoring guide that seeks to evaluate a student's performance based on the sum of a full range of criteria rather than a single numerical score.
• A rubric is an authentic assessment tool used to measure students' work.
• Authentic assessment is used to evaluate students' work by measuring the product according to real-life criteria.
• A rubric is a working guide for students and teachers, usually handed out before the assignment begins in order to get students to think about the criteria on which their work will be judged.
From teachervision.com
TYPES OF RUBRICS
• “Holistic” rubrics
• “Analytical” rubrics
• “Task-specific” rubrics
• “General” rubrics
HOLISTIC RUBRICS
• Make a single assessment of the “overall quality” of the project
• Are quick and easy
• May be reliable (?) but are not likely to be as valid as analytic rubrics
• Provide little information to the user
AN EXAMPLE OF A HOLISTIC RUBRIC
• “Consistently does all or most of the following:” (list of good things)
• “Does most or many of the following:” (same list of good things)
• “Does most or many of the following:” (list of bad things)
• “Consistently does all or almost all of the following:” (moderately different list of bad things)
HOLISTIC RUBRICS?
• Good for sorting.
• Not good for understanding or improving performance.
ANALYTIC RUBRICS
• Identify the criteria that are important to a quality product or performance.
• Identify levels or ratings for each criterion.
• Identify descriptions of performance on each criterion at each level.
• Often provide scores for each criterion based on the ratings, and sum the scores to get an overall score or produce a grade (see the sketch below).
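
To make the scoring step concrete, here is a minimal sketch in Python (not any particular tool): an analytic rubric as plain data, with one rating chosen per criterion and the point values summed into an overall score. The criterion names, level labels, and point values are illustrative assumptions.

```python
# A minimal sketch: criterion names, level labels, and point values are assumptions.
analytic_rubric = {
    "Thesis":       {"Excellent": 4, "Good": 3, "Developing": 2, "Beginning": 1},
    "Evidence":     {"Excellent": 4, "Good": 3, "Developing": 2, "Beginning": 1},
    "Organization": {"Excellent": 4, "Good": 3, "Developing": 2, "Beginning": 1},
}

def score(selected: dict[str, str]) -> int:
    """Sum the points for the rating selected on each criterion."""
    return sum(analytic_rubric[criterion][rating]
               for criterion, rating in selected.items())

print(score({"Thesis": "Good", "Evidence": "Excellent", "Organization": "Developing"}))  # 9
```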
ANATOMY OF A (TYPICAL) ANALYTIC RUBRIC
Image from the “Rubrics and Bloom’s Taxonomy” Wiki at https://hillerspires.wikispaces.com/, licensed under a Creative Commons Attribution Share-Alike 3.0 License.
A SAMPLE RUBRIC
Rubric for a Chocolate Chip Cookie (columns: Criteria, Ratings, Descriptions)
WHAT’S WRONG WITH THIS RUBRIC?
Rubric for a Chocolate Chip Cookie?
A TEMPLATE FOR AN ANALYTIC RUBRIC
http://www.slideshare.net/missreynova/a-sample-of-analytic-scoring-rubrics?related=1
MOST RUBRICS HAVE REAL PROBLEMS!!
• Validity, the extent to which an assessment measures what it claims to measure, is compromised by an imbalance in the number of criteria of a given type, which places undue emphasis on less important factors.
• When there is an imbalance among criteria, the resulting assessment is misleading.
• This can be resolved by grouping and weighting criteria, but most rubrics don’t do this (see the sketch below).
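
A minimal sketch of one way to address the imbalance (illustrative Python, not any particular tool): criteria are grouped, and each group carries an explicit weight, so several low-level criteria can no longer outweigh a few higher-order ones. The group names, criteria, and weights are assumptions for the example.

```python
# A minimal sketch with assumed group names and weights: each group contributes a
# fixed share of the overall score, regardless of how many criteria fall in it.
weighted_groups = {
    # group name: (weight, criteria in the group)
    "Reasoning": (0.6, ["Argument quality", "Use of evidence"]),
    "Mechanics": (0.4, ["Grammar", "Spelling", "Formatting", "Citations", "Length"]),
}

def weighted_score(ratings: dict[str, float], max_rating: float = 4.0) -> float:
    """Average the ratings within each group, then combine groups by weight (0-1 scale)."""
    total = 0.0
    for weight, criteria in weighted_groups.values():
        group_avg = sum(ratings[c] for c in criteria) / len(criteria)
        total += weight * (group_avg / max_rating)
    return total

ratings = {"Argument quality": 4, "Use of evidence": 3,
           "Grammar": 2, "Spelling": 2, "Formatting": 2, "Citations": 2, "Length": 2}
# Reasoning carries 60% of the overall score even though it has fewer criteria than mechanics.
print(weighted_score(ratings))
```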
MOST RUBRICS HAVE REAL PROBLEMS!!
• Reliability, the extent to which an assessment produces stable scores for the same product or performance when used by different reviewers or when used repeatedly by the same reviewer, is compromised by using the same number of ratings (usually 4) for each criterion.
• Some criteria really have only two levels; others may have many.
• Forcing an inappropriate number of categories will increase the probability that raters will choose different ratings.
• Using ratings that reflect actual performance will increase reliability.
MOST RUBRICS HAVE REAL PROBLEMS!!
• When “multidimensional” criteria are used:
  • the quality and utility of feedback are reduced,
  • scoring is made more difficult, and
  • reliability and validity are reduced.
AN EXAMPLE OF MULTI-DIMENSIONAL CRITERIA
https://www.csusm.edu/ids/course-design-and-instruction/assessment/rubrics/writing_rubric_Northeastern.pdf
“Offers solid but less original reasoning. Assumptions are not always recognized or made explicit. Contains some appropriate details or examples.”
A BETTER, COMPETENCY-BASED ANALYTIC RUBRIC
• Weighted “criteria” collect weighted “indicators.”
• Indicators can have different numbers of ratings, which have numeric values and can indicate mastery.
• Descriptions, importance statements, recommendations, and more are stored “behind the scenes” when rubrics are created. (A minimal sketch of such a structure follows.)
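
As an illustration of the structure the slide describes, here is a sketch in Python with assumed field names; it is not the Penn State Rubric Processor's actual schema. Weighted criteria hold weighted indicators, each indicator defines its own set of ratings with numeric values and a mastery flag, and descriptions and recommendations are stored alongside each rating.

```python
# A minimal sketch, not the Rubric Processor's schema: all field names are assumptions.
from dataclasses import dataclass

@dataclass
class Rating:
    label: str
    value: float
    mastery: bool            # does this rating count as competency/mastery?
    description: str = ""    # shown to the learner
    recommendation: str = "" # stored "behind the scenes" for narrative feedback

@dataclass
class Indicator:
    name: str
    weight: float
    ratings: list[Rating]    # indicators may have 2, 4, or any number of ratings

@dataclass
class Criterion:
    name: str
    weight: float
    indicators: list[Indicator]

reasoning = Criterion("Reasoning", weight=0.6, indicators=[
    Indicator("States assumptions explicitly", weight=0.5, ratings=[
        Rating("Not yet", 0, False, "Assumptions are left implicit.",
               "List each assumption before the argument."),
        Rating("Mastery", 1, True, "Assumptions are stated and examined."),
    ]),
    Indicator("Quality of evidence", weight=0.5, ratings=[
        Rating("Weak", 1, False), Rating("Adequate", 2, False),
        Rating("Strong", 3, True), Rating("Exemplary", 4, True),
    ]),
])
```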
NEW RUBRIC-BASED TOOLS?
• When assessing, selecting a rating provides stored descriptions and recommendations that are collected to form a narrative report.
• These may be edited to personalize the message, as needed.
• After the assessments are complete, emails to students and aggregated reports may be generated. (A minimal sketch follows.)
The Penn State “Rubric Processor”
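
A minimal, self-contained sketch of that workflow in Python (hypothetical indicator names and stored text, not the actual Rubric Processor): the rating selected for each indicator contributes its stored description and recommendation to a narrative report that the assessor can then edit and email.

```python
# A minimal sketch with hypothetical data: each selected rating carries stored
# feedback text that is collected into an editable narrative report.
selected_ratings = {
    # indicator name: (rating label, mastery?, stored description, stored recommendation)
    "States assumptions explicitly": ("Not yet", False,
        "Assumptions are left implicit.", "List each assumption before the argument."),
    "Quality of evidence": ("Strong", True,
        "Evidence is relevant and well integrated.", ""),
}

def narrative_report(student: str, selections: dict) -> str:
    """Collect the stored text behind each selected rating into a narrative draft."""
    lines = [f"Feedback for {student}:"]
    for indicator, (label, mastery, description, recommendation) in selections.items():
        lines.append(f"- {indicator}: {label}. {description}")
        if not mastery and recommendation:
            lines.append(f"  Suggestion: {recommendation}")
    return "\n".join(lines)

# The assessor edits this draft before emails and aggregated reports are generated.
print(narrative_report("Student A", selected_ratings))
```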
VISUAL REPORT FROM THE RUBRIC PROCESSOR
• Red indicates a performance rating that is not at the competency/mastery level.
• Gold indicates a performance that has been identified as competency.
• A “score” is calculated based on weightings.
• Aggregated scores from groups of learners can be represented as percentages in each cell.
• Group displays could be used as a “visual query generator,” calling up random, anonymized examples of student work at each level. (A minimal aggregation sketch follows.)
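
As a sketch of the aggregation idea (Python, hypothetical data, not the Rubric Processor itself): counting how many learners landed in each rating cell of an indicator and expressing the counts as percentages is enough to drive a group display.

```python
# A minimal sketch with hypothetical data: percentage of a group in each rating cell.
from collections import Counter

def cell_percentages(selected_labels: list[str]) -> dict[str, float]:
    """Map each rating label to the percentage of learners who received it."""
    counts = Counter(selected_labels)
    n = len(selected_labels)
    return {label: 100 * count / n for label, count in counts.items()}

group = ["Weak", "Adequate", "Adequate", "Strong", "Exemplary", "Strong", "Strong", "Adequate"]
print(cell_percentages(group))
# e.g. {'Weak': 12.5, 'Adequate': 37.5, 'Strong': 37.5, 'Exemplary': 12.5}
```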
IN SUMMARY:
• Assessments of higher-order work will become our primary business.
• Rubrics are an excellent way to provide high-quality assessments of higher-order products and performances.
• Not all rubrics are created equal; most have serious flaws.
• Effective rubrics are based on sound learning outcomes and corresponding assignments that elicit the desired performances.
NOW IT’S YOUR TURN!
In your table groups, follow these steps to create a rubric:
• Start with a well-written learning outcome for a higher-order task.
• Identify the aspects of the product or performance that determine quality. Each of these becomes a criterion. (Avoid multi-dimensional criteria! You can group criteria, but don’t combine them.)
• Group and/or weight the criteria based on their importance.
• Identify the levels of performance expected for each criterion, assign scores for each rating, and/or determine which level(s) will be accepted as “mastery” or “competency.”
NEXT STEPS TO COMPLETE YOUR RUBRIC
Consider the following to increase the validity and reliability of your rubric:
• Share the rubric with experts to establish reliability and identify ways to improve it.
• Pilot the rubric with learners to determine:
  1. Whether the number of ratings for each criterion is appropriate
  2. Whether they convey adequate information to users to result in improvement upon re-submission
  3. Whether the top products or performances, as indicated by the rubric, match experts’ holistic impressions
• Revise as necessary.
• Celebrate!
THANK YOU.
This presentation is available on slideshare.com.
