PERFORMANCE MEASURE RUBRIC
Template #6-Performance Measure Rubric – May 2014
The Process for…
Assuring the quality of “Teacher-Developed”
Student Performance Measures
Introduction
The purpose of this document is to provide guidance for developing measures of
student performance that meet the criteria within the Performance Measure Rubric. The
rubric is designed as a self-assessment tool for ascertaining the quality of performance
measures created by educators for local use. The process used to design, build, and
review teacher-made performance measures is contained within the Assessment Literacy Series
(ALS) Quick Start program.
The Quick Start program is foundational professional training in a step-by-step
process for creating high-quality performance measures. Following these procedures, educators
develop performance measures focused on the skills, knowledge, and concepts identified
within the targeted content standards. The entire program can be accessed through the
Homeroom online learning platform.
Purpose Statement
The enclosed rubric is designed to examine the quality characteristics of teacher-made
performance measures. It comprises 18 technical descriptors organized into three
(3) strands. The rubric's purpose is to provide teachers with a self-assessment tool that
assists in building high-quality measures of student achievement.
Rating Tasks
Step 1. Review information, data, and documents associated with the design,
development, and review of the selected performance measure.
Step 2. Assign a value in the “Rating” column for each aspect within a particular strand
using the following scale:
a. (1) = fully addressed
b. (.5) = partially addressed
c. (0) = not addressed
d. (N/A) = not applicable at this time
Step 3. Reference supporting information associated with each assigned rating in the
“Evidence” column. All partial (.5) ratings should have statements in the
“Evidence” column identifying the shortfall(s).
Step 4. Add any additional notations and/or comments that articulate any important
nuances of the performance measure. Also note any corrective actions that may
occur in the future given the identified shortfall(s) in the technical evidence.
Step 5. Compile assigned values and place in the “Strand Summary” row.
Summary Matrix

Strand      Points Possible    Points Earned
Design            5               _____
Build             6               _____
Review            7               _____
Summary          18               _____
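The tally in Steps 2 and 5 can be sketched in a few lines of code. This is an illustrative aid, not part of the rubric: the strand/task-ID layout mirrors the tables below, ratings use the 1 / 0.5 / 0 scale, and "N/A" descriptors are simply excluded from a strand's possible points.

```python
# Sketch of Steps 2-5: assign ratings, then compile strand summaries.
STRANDS = {
    "Design": ["1.1", "1.2", "1.3", "1.4", "1.5"],
    "Build":  ["2.1", "2.2", "2.3", "2.4", "2.5", "2.6"],
    "Review": ["3.1", "3.2", "3.3", "3.4", "3.5", "3.6", "3.7"],
}

def strand_summary(ratings):
    """Compile assigned values into per-strand totals.

    `ratings` maps a task ID to 1, 0.5, 0, or "N/A".
    Returns {strand: (points_earned, points_possible)},
    where "N/A" descriptors reduce the points possible.
    """
    summary = {}
    for strand, task_ids in STRANDS.items():
        scored = [ratings[t] for t in task_ids if ratings.get(t) != "N/A"]
        summary[strand] = (sum(scored), len(scored))
    return summary

# Example: everything fully addressed except 1.4 (partial) and 3.5 (N/A).
ratings = {t: 1 for ids in STRANDS.values() for t in ids}
ratings["1.4"] = 0.5      # partially addressed
ratings["3.5"] = "N/A"    # not applicable at this time
print(strand_summary(ratings))
```

Partial (0.5) ratings, as Step 3 requires, would each carry an Evidence-column note identifying the shortfall.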
STRAND 1: DESIGN
Task ID    Descriptor    Rating    Evidence
1.1
The purpose of the performance measure is explicitly stated
(who, what, why).
1.2
The performance measure has targeted content standards
representing a range of knowledge and skills students are
expected to know and demonstrate.
1.3
The performance measure’s design is appropriate for the
intended audience and reflects challenging material needed
to develop higher-order thinking skills.
1.4
Specification tables articulate the number of items/tasks,
item/task types, passage readability, and other information
about the performance measure -OR- Blueprints are used
to align items/tasks to targeted content standards.
1.5
Items/tasks are rigorous (designed to measure a range of
cognitive demands/higher-order thinking skills at
developmentally appropriate levels) and of sufficient
quantities to measure the depth and breadth of the targeted
content standards.
Strand 1 Summary: ___ out of 5
Additional Comments/Notes
STRAND 2: BUILD
Task ID    Descriptor    Rating    Evidence
2.1
Items/tasks and score keys are developed using
standardized procedures, including scoring rubrics for
human-scored, open-ended questions (e.g., short
constructed response, writing prompts, performance tasks,
etc.).
2.2
Items/tasks are created and reviewed in terms of: (a)
alignment to the targeted content standards, (b) content
accuracy, (c) developmental appropriateness, (d) cognitive
demand, and (e) bias, sensitivity, and fairness.
2.3
Administrative guidelines are developed that contain the
step-by-step procedures used to administer the performance
measure in a consistent manner, including scripts to orally
communicate directions to students, day and time
constraints, and allowable accommodations/adaptations.
2.4
Scoring guidelines are developed for human-scored
items/tasks to promote score consistency across items/tasks
and among different scorers. These guidelines articulate
point values for each item/task used to combine results into
an overall score.
2.5
Summary scores are reported using both raw score points
and performance level. Performance levels reflect the
range of scores possible on the assessment and use terms
or symbols to denote each level.
2.6
The total time to administer the performance measure is
developmentally appropriate for the test-taker. Generally,
this is 30 minutes or less for young students and up to 60
minutes per session for older students (high school).
Strand 2 Summary: ___ out of 6
Additional Comments/Notes
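Descriptors 2.4 and 2.5 together describe a simple arithmetic pipeline: item/task point values combine into an overall raw score, which is then reported alongside a performance level. A minimal sketch, in which the cut scores and level names are purely hypothetical (the rubric does not prescribe any):

```python
# Hypothetical performance-level mapping for a 20-point measure.
# Cut scores and level labels below are illustrative only.
CUTS = [(16, "Advanced"), (12, "Proficient"), (8, "Basic")]

def performance_level(raw_score):
    """Return the label for the first cut score the raw score meets."""
    for cut, label in CUTS:
        if raw_score >= cut:
            return label
    return "Below Basic"

item_scores = [4, 3, 4, 3]          # per-item points from the scoring guidelines
raw = sum(item_scores)              # overall raw score (2.4)
print(raw, performance_level(raw))  # reported together, per 2.5
```

Per 2.5, the levels span the full range of possible scores, so every raw score maps to exactly one level.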
STRAND 3: REVIEW
Task ID    Descriptor    Rating    Evidence
3.1
The performance measure is reviewed in terms of design fidelity:
• Items/tasks are distributed based upon the design properties found within the specification or blueprint documents;
• Item/task and form statistics are used to examine levels of difficulty, complexity, distracter quality, and other properties; and
• Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2
The performance measure was reviewed in terms of editorial soundness, ensuring consistency and accuracy of all documents (e.g., administration guide):
• Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements;
• Applies Universal Design principles; and
• Ensures linguistic demands and readability are developmentally appropriate.
3.3
The performance measure was reviewed in terms of alignment characteristics:
• Pattern consistency (within specifications and/or blueprints);
• Targeted content standards match;
• Cognitive demand; and
• Developmental appropriateness.
3.4
Cut scores are established for each performance level.
Performance level descriptors describe the achievement
continuum using content-based competencies for each
assessed content area.
Additional Comments/Notes
Note: The indicators below are evaluated after students have taken the assessment
(i.e., post-administration).
STRAND 3: REVIEW (cont.)
Task ID    Descriptor    Rating    Evidence
3.5
As part of the assessment cycle, post-administration analyses are conducted to
examine aspects such as item/task performance, scale functioning, overall score
distribution, rater drift, content alignment, etc.
3.6
The performance measure has score validity evidence demonstrating that item
responses are consistent with content specifications. Data suggest that the
scores represent the intended construct by using an adequate sample of
items/tasks within the targeted content standards. Other sources of validity
evidence, such as the interrelationship of items/tasks and the alignment
characteristics of the performance measure, are collected.
3.7
Reliability coefficients are reported for the performance
measure, which includes estimating internal consistency.
Standard errors are reported for summary scores. When
applicable, other reliability statistics such as classification
accuracy, rater reliability, etc. are calculated and reviewed.
Strand 3 Summary: ___ out of 7
Additional Comments/Notes
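Descriptor 3.7 calls for an internal-consistency estimate. One common choice (among others the rubric leaves open) is Cronbach's alpha, which compares the sum of the item variances to the variance of the total score. A minimal sketch; the item-score matrix is illustrative, not real data:

```python
# Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances) / var(total score)).
def cronbach_alpha(scores):
    """scores: one list of item scores per student, all the same length."""
    k = len(scores[0])                  # number of items/tasks

    def variance(xs):                   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([s[i] for s in scores]) for i in range(k)]
    total_var = variance([sum(s) for s in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Illustrative 4-student x 4-item score matrix (dichotomous items).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(round(cronbach_alpha(scores), 3))  # → 0.667
```

In practice the standard errors, classification accuracy, and rater-reliability statistics that 3.7 also names would come from a dedicated psychometric package rather than hand-rolled code.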