PERFORMANCE MEASURE RUBRIC
Template #3-Performance Measure Rubric-June 2014-FINAL
The Process for…
Assuring the quality of “Teacher-Developed”
Student Performance Measures
Introduction
The purpose of this document is to provide guidance for developing measures of
student performance that meet the criteria within the Performance Measure Rubric. The
rubric is a self-assessment tool for ascertaining the quality of performance measures
created by educators for local use. The process used to design, build, and review
teacher-made performance measures is contained within the Assessment Literacy Series
(ALS) Quick Start program.
The Quick Start program is foundational professional training in a process for creating
high-quality performance measures. Using a step-by-step set of procedures, educators develop
performance measures focused on the skills, knowledge, and concepts identified within the
targeted content standards. The entire program can be accessed through the Homeroom online
learning platform.
Purpose Statement
The enclosed rubric is designed to examine the quality characteristics of teacher-made
performance measures. It comprises 18 technical descriptors organized into three (3)
strands. The rubric’s purpose is to provide teachers with a self-assessment tool that assists in
building high-quality measures of student achievement.
Rating Tasks
Step 1. Review information, data, and documents associated with the design,
development, and review of the selected performance measure.
Step 2. Assign a value in the “Rating” column for each aspect within a particular strand
using the following scale:
a. (1) = fully addressed
b. (.5) = partially addressed
c. (0) = not addressed
d. (N/A) = not applicable at this time
Step 3. Reference supporting information associated with each assigned rating in the
“Evidence” column. All partial (.5) ratings should have statements in the
“Evidence” column identifying the shortfall(s).
Step 4. Add any additional notations and/or comments that articulate any important
nuances of the performance measure. Also note any corrective actions that may
occur in the future given the identified shortfall(s) in the technical evidence.
Step 5. Total the assigned values and enter the sum in the “Strand Summary” row.
Summary Matrix

Strand     Points Possible     Points Earned
Design           5
Build            6
Review           7
Summary         18
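The scoring arithmetic in Steps 2 and 5 can be illustrated with a short sketch. This is purely illustrative (the rubric itself is completed by hand); the function and variable names below are hypothetical, and N/A ratings are assumed to be excluded from a strand’s total.

```python
# Illustrative sketch of the rubric's rating scale (Step 2) and
# strand totals (Step 5). Scale: 1 = fully addressed,
# 0.5 = partially addressed, 0 = not addressed, None = N/A
# (assumed here to be excluded from the strand total).

STRAND_POINTS_POSSIBLE = {"Design": 5, "Build": 6, "Review": 7}  # 18 total

def strand_summary(ratings):
    """Sum the assigned values for one strand, skipping N/A (None)."""
    return sum(r for r in ratings if r is not None)

# Example: Strand 1 (Design), descriptors 1.1-1.5
design_ratings = [1, 1, 0.5, 1, 0.5]
earned = strand_summary(design_ratings)  # 4.0
print(f"Design: {earned} out of {STRAND_POINTS_POSSIBLE['Design']}")
```

Per Step 3, any 0.5 rating in such a tally would also need an accompanying “Evidence” statement identifying the shortfall.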
STRAND 1: DESIGN
Task ID   Descriptor   Rating   Evidence
1.1
The purpose of the performance measure is explicitly stated
(who, what, why).
1.2
The performance measure has targeted content standards
representing a range of knowledge and skills students are
expected to know and demonstrate.
1.3
The performance measure’s design is appropriate for the
intended audience and reflects challenging material needed
to develop higher-order thinking skills.
1.4
Specification tables articulate the number of items/tasks,
item/task types, passage readability, and other information
about the performance measure -OR- Blueprints are used
to align items/tasks to targeted content standards.
1.5
Items/tasks are rigorous (designed to measure a range of
cognitive demands/higher-order thinking skills at
developmentally appropriate levels) and of sufficient
quantities to measure the depth and breadth of the targeted
content standards.
Strand 1 Summary: ___ out of 5
Additional Comments/Notes
STRAND 2: BUILD
Task ID   Descriptor   Rating   Evidence
2.1
Items/tasks and score keys are developed using
standardized procedures, including scoring rubrics for
human-scored, open-ended questions (e.g., short
constructed response, writing prompts, performance tasks,
etc.).
2.2
Items/tasks are created and reviewed in terms of: (a)
alignment to the targeted content standards, (b) content
accuracy, (c) developmental appropriateness, (d) cognitive
demand, and (e) bias, sensitivity, and fairness.
2.3
Administrative guidelines are developed that contain the
step-by-step procedures used to administer the performance
measure in a consistent manner, including scripts to orally
communicate directions to students, day and time
constraints, and allowable accommodations/adaptations.
2.4
Scoring guidelines are developed for human-scored
items/tasks to promote score consistency across items/tasks
and among different scorers. These guidelines articulate
point values for each item/task used to combine results into
an overall score.
2.5
Summary scores are reported using both raw score points
and performance level. Performance levels reflect the
range of scores possible on the assessment and use terms
or symbols to denote each level.
2.6
The total time to administer the performance measure is
developmentally appropriate for the test-taker. Generally,
this is 30 minutes or less for young students and up to 60
minutes per session for older students (high school).
Strand 2 Summary: ___ out of 6
Additional Comments/Notes
STRAND 3: REVIEW
Task ID   Descriptor   Rating   Evidence
3.1
The performance measures are reviewed in terms of design fidelity:
• Items/tasks are distributed based upon the design properties found within the specification or blueprint documents;
• Item/task and form statistics are used to examine levels of difficulty, complexity, distracter quality, and other properties; and
• Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2
The performance measure was reviewed in terms of editorial soundness, ensuring the consistency and accuracy of all documents (e.g., administration guide):
• Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements;
• Applies Universal Design principles; and
• Ensures linguistic demands and readability are developmentally appropriate.
3.3
The performance measure was reviewed in terms of alignment characteristics:
• Pattern consistency (within specifications and/or blueprints);
• Targeted content standards match;
• Cognitive demand; and
• Developmental appropriateness.
3.4
Cut scores are established for each performance level.
Performance level descriptors describe the achievement
continuum using content-based competencies for each
assessed content area.
Additional Comments/Notes
Note: The indicators below are evaluated after students have taken the assessment
(i.e., post-administration).
STRAND 3: REVIEW (cont.)
Task ID   Descriptor   Rating   Evidence
3.5
As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
3.6
The performance measure has score validity evidence demonstrating that item responses were consistent with content specifications. Data suggest that the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected.
3.7
Reliability coefficients are reported for the performance measure, including estimates of internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics, such as classification accuracy and rater reliability, are calculated and reviewed.
Strand 3 Summary: ___ out of 7
Additional Comments/Notes