ADMINISTERING, ANALYZING, AND
IMPROVING THE TEST OR ASSESSMENT
NEMA GRACE B. MEDILLO
GOAL IN THIS CHAPTER:
To provide suggestions on how to
avoid common pitfalls in test assembly,
administration, and scoring.
Process of Evaluating Classroom Achievement:
1. Assembling the Test
2. Administering the Test
3. Scoring the Test
4. Analyzing the Test
5. Debriefing
ASSEMBLING THE TEST
Packaging the Test
Reproducing the Test
Packaging the Test
Group together all items of similar format
Arrange test items from easy to hard
Space the items for easy reading
Keep the items and options on the same page
Position illustrations near their descriptions and above the item
Check your answers
Determine how students will record their answers
Provide space for Name and Date
Check test directions
Proofread the test
Reproducing the Test
Know the Photocopying Machine
Specify Copying Instructions
File Original Test
ADMINISTERING THE TEST
Maintain a Positive Attitude
Maximize Achievement Motivation
Equalize Advantages
Avoid Surprises
Clarify the Rules
Rotate Distribution
Remind Students to Check Their Copies
Monitor Students
Minimize Distractions
Give Time Warnings
Collect Tests Uniformly
SCORING THE TEST
Prepare the Answer Key
Check the Answer Key
Score Blindly
Check Machine-Scored Answer Sheets
Check Scoring
Record Scores
ANALYZING THE TEST
Quantitative Item Analysis
Qualitative Item Analysis
Item Analysis Modifications for
the Criterion-Referenced Test
Item Analysis Terminology
Item Analysis Terminology
Quantitative Item Analysis
• A numerical method for analyzing test items that uses
students' responses to the item's alternatives or options.
Qualitative Item Analysis
• A non-numerical method for analyzing test items that does not
use students' responses but instead considers test objectives,
content validity, and technical item quality
Key
• Correct option in a multiple-choice item
Item Analysis Terminology
Distractor
• Incorrect option in a multiple-choice item
Difficulty Index
• Proportion of students who answered the item correctly.
Discrimination Index (D)
• Measure of the extent to which a test item discriminates or
differentiates between students who do well on the overall
test and those who do not do well on the overall test.
Discrimination Index (D)
Positive Discrimination Index
• Those who did well on the overall test chose the
correct answer for a particular item more often than
those who did poorly on the overall test.
Negative Discrimination Index
• Those who did poorly on the overall test chose the
correct answer for a particular item more often than
those who did well on the overall test.
Zero Discrimination Index
• Those who did well and those who did poorly on the
overall test chose the correct answer for a particular
item with equal frequency.
Quantitative Item Analysis
Case Sample and Guide Questions
Difficulty Level
Discrimination Index
Miskeying
Guessing
Ambiguity
Consider the case below.
Suppose your students chose the options to a
four-alternative multiple-choice item.
Let C be the correct answer.
A B C* D
3 0 18 9
How does this information help us?
Is the item too difficult/easy for the students?
Are the distractors of the items effective?
Item X
Guide questions in quantitative item analysis
1. What is the difficulty level?
2. What is the discrimination index?
3. Should this item be eliminated?
4. Should any distractor(s) be modified?
To compute the difficulty level of an item:

Item X
A B C* D
3 0 18 9

What is the difficulty level of the item?
Do you consider the item difficult or easy? Why?

p = Number selecting correct answer / Total number taking the test
Solving the difficulty index for Item X (A = 3, B = 0, C* = 18, D = 9):

p = Number selecting correct answer / Total number taking the test
p = 18 / 30 = 0.60

Since the difficulty level of the item is 0.60 (60%), the item is
moderately difficult.
Note:
If P level > 0.75, the item is considered relatively easy.
If P level < 0.25, the item is considered relatively difficult.
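To make the computation concrete, here is a minimal Python sketch (my own illustration; the function and variable names are not from the chapter) that computes p from an option-count table like the one for Item X.

```python
def difficulty_index(option_counts, key):
    """p = number selecting the correct answer / total number taking the test."""
    total = sum(option_counts.values())  # total number taking the test
    return option_counts[key] / total

# Item X from the example: C is keyed, 30 students in all.
item_x = {"A": 3, "B": 0, "C": 18, "D": 9}
p = difficulty_index(item_x, key="C")
print(round(p, 2))  # 0.6 -> moderately difficult (0.25 < p < 0.75)
```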
Discrimination Index
Steps in determining Discrimination Index
1. Arrange the papers from highest to lowest score.
2. Separate the papers into an upper group and a lower group
(e.g., the top half and the bottom half of the class).
3. For each item, count the number in the upper group
and the number in the lower group that chose each
alternative.
4. Record your information for each item.
Example for Item X (class size = 30)
Options A B C* D
Upper 1 0 11 3
Lower 2 0 7 6
5. Compute D by plugging the appropriate
numbers into the formula.

What is the discrimination index of item X?
Is the discrimination index positive or negative?
Which group got the item correct more frequently?
D = (Number who got the item correct in the upper group −
Number who got the item correct in the lower group) /
Number of students in either group

(If group sizes are unequal, choose the higher number.)
Plugging in the data:

D = (11 − 7) / 15 = 0.267
The discrimination index of item X is 0.267, which is
positive: more students who did well on the overall test
answered the item correctly than students who did poorly
on the overall test.
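The five steps above reduce to a few lines of code. Below is an illustrative Python sketch (again, the names are my own) that computes D from the upper- and lower-group counts for item X.

```python
def discrimination_index(upper_counts, lower_counts, key):
    """D = (correct in upper group - correct in lower group) / students in either group.

    If the group sizes are unequal, the text says to choose the higher number.
    """
    group_size = max(sum(upper_counts.values()), sum(lower_counts.values()))
    return (upper_counts[key] - lower_counts[key]) / group_size

# Item X (class size = 30, so 15 students in each half)
upper = {"A": 1, "B": 0, "C": 11, "D": 3}
lower = {"A": 2, "B": 0, "C": 7, "D": 6}
print(round(discrimination_index(upper, lower, key="C"), 3))  # 0.267 -> positive
```

The same two functions reproduce the answers for items Y and Z below (p = 0.18 and D = 0.214 for Y; p = 0.467 and D = −0.40 for Z).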
Implication
Difficulty Level (p) = 0.60
Discrimination Index (D) = 0.267

Should this item be eliminated? NO.
The item is moderately difficult and has positive (desirable)
discrimination ability.
Should any distractor(s) be modified? YES.
Option B ought to be modified or replaced, since no one chose it.

Item X
A B C* D
3 0 18 9
Let's look at the responses for two more items.
Item Y (class size = 28)
Options A* B C D
Upper 4 1 5 4
Lower 1 7 3 3
Item Z (class size = 30)
Options A B* C D
Upper 3 4 3 5
Lower 0 10 2 3
1. What is the difficulty level?
2. What is the discrimination index?
3. Should this item be eliminated?
4. Should any distractor(s) be modified?
Item Y
Difficulty Level = 0.18
Discrimination Index = 0.214
Should this item be eliminated?
Should any distractor(s) be eliminated?
No, since it is positively discriminating. However, it is a
difficult item; only 18% of the class got it right.
Yes; C and D attracted more students who did well on the
test overall.
Remember:
Fewer students who do well on the test should choose
each distractor than students who do poorly.
More students who do well on the test should choose
the correct answer than students who do poorly.
Item Z
Difficulty Level = 0.467
Discrimination Index = −0.40

Should this item be eliminated?
Yes! The item is moderately difficult, but it
discriminates negatively.

Should any distractor(s) be eliminated?
Since we already decided to eliminate the item, this is a
moot question.
Remember:
One purpose of testing is to discriminate between those
students who know their stuff and those who do not.
MISKEYING
When an item is miskeyed, most students who did well on
the test will select an option that is a distractor, rather
than the option that is keyed.
Consider this miskeyed item:
Who was the first astronaut
to set foot on the moon?
a. John Glenn
b. Scott Carpenter
c. Neil Armstrong
*d. Alan Shepard
Responses
A B C D*
Upper 1 1 9 2
Most students in the upper half of the class failed to
select the keyed option.
Remember, just as you are bound to make scoring
errors, you are bound to miskey an item occasionally.
GUESSING
Guessing is likely to occur when the item measures content
that is:
• Not covered in class or the text
• So difficult that even the upper-half students
have no idea what the correct answer is
• So trivial that students are unable to choose from
among the options provided.
The following choice distribution suggests that guessing occurred:
A B C* D
Upper Half 4 3 3 3
Each alternative is about equally attractive to
students in the upper half.
Ambiguity
Among the upper group, one of the distractors is
chosen with about the same frequency as the correct
answer.
The following distribution suggests that the item is ambiguous:
A B C D*
Upper Half 7 0 1 7
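These three patterns can be screened for mechanically. The sketch below is one possible set of heuristics; the ±1 margins are my own illustrative thresholds, not values from the text, and it inspects only the upper group's choices, as the discussion above does.

```python
def screen_upper_group(upper_counts, key):
    """Rough flags for guessing, miskeying, and ambiguity from upper-group choices.

    The +/-1 margins are illustrative assumptions, not values from the chapter.
    """
    counts = upper_counts.values()
    top = max(counts)
    rivals = [opt for opt, n in upper_counts.items()
              if opt != key and abs(n - upper_counts[key]) <= 1]
    if top - min(counts) <= 1:
        return "possible guessing"     # every option about equally attractive
    if top - upper_counts[key] > 1:
        return "possible miskeying"    # a distractor clearly beats the key
    if rivals:
        return "possible ambiguity"    # a distractor rivals the key
    return "no red flags"

print(screen_upper_group({"A": 1, "B": 1, "C": 9, "D": 2}, key="D"))  # possible miskeying
print(screen_upper_group({"A": 4, "B": 3, "C": 3, "D": 3}, key="C"))  # possible guessing
print(screen_upper_group({"A": 7, "B": 0, "C": 1, "D": 7}, key="D"))  # possible ambiguity
```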
QUALITATIVE ITEM ANALYSIS
• Matching items and objectives
• Editing poorly written items
• Improving the content validity of the test
• Analyzing grammatical cues, specific determiners,
double negatives, multiple defensible answers, and
items that fail to match instructional objectives
ITEM ANALYSIS MODIFICATIONS FOR
THE CRITERION-REFERENCED TEST
• Using Pre- and Posttests as Upper and Lower Groups
• Comparing the Percentage Answering Each Item Correctly
on Both Pre- and Posttest
• Determining the Percentage of Items Answered in the
Expected Directions
• Limitations of the Modifications
Using Pre- and Posttests as
Upper and Lower Groups

Pretest (given prior to instruction): most of the test items
are answered incorrectly. The results serve as the Lower
Group (p level = 0.30 or lower).

Posttest (given after instruction): most items are answered
correctly. The results serve as the Upper Group
(p = 0.70 or higher).
Analyzing Sample Data
Example 1:
Number of students choosing each option (n = 25)
Option At pretest (L) At posttest (U)
A 9 1
B 7 1
C 3 2
D* (Key) 6 21
Steps
1. Compute p levels for both tests.

p = No. choosing correctly / Total number

Pretest: p = 6 / 25 = 0.24
Posttest: p = 21 / 25 = 0.84

This is an improvement from 24% to 84%.
2. Determine the discrimination index (D) for
the key.

D = (Number correct (post) − Number correct (pre)) /
Number in either group

D = (21 − 6) / 25 = 15 / 25 = 0.60

The key has a positive discrimination index.
3. Determine whether each option separately
discriminates negatively.

Option A: D = (1 − 9) / 25 = −8 / 25 = −0.32
Option B: D = (1 − 7) / 25 = −6 / 25 = −0.24
Option C: D = (2 − 3) / 25 = −1 / 25 = −0.04
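The three steps are easy to bundle into one routine. Here is an illustrative Python sketch (names mine) that takes the pretest and posttest option counts for an item and reports the p levels, the key's D, and each distractor's D.

```python
def pre_post_item_analysis(pre_counts, post_counts, key):
    """Criterion-referenced item check using pretest (lower) and posttest (upper) groups."""
    n = sum(pre_counts.values())                      # assumes both groups have n students
    p_pre = pre_counts[key] / n                       # step 1: p level at pretest
    p_post = post_counts[key] / n                     # step 1: p level at posttest
    d_key = (post_counts[key] - pre_counts[key]) / n  # step 2: D for the key
    d_options = {opt: (post_counts[opt] - pre_counts[opt]) / n
                 for opt in pre_counts if opt != key}  # step 3: D for each distractor
    return p_pre, p_post, d_key, d_options

# Example 1 from the text: n = 25, D* is the key.
pre = {"A": 9, "B": 7, "C": 3, "D": 6}
post = {"A": 1, "B": 1, "C": 2, "D": 21}
p_pre, p_post, d_key, d_opts = pre_post_item_analysis(pre, post, key="D")
print(p_pre, p_post)  # 0.24 0.84 -> sizeable gain
print(d_key)          # 0.6      -> positive D for the key
print(d_opts)         # {'A': -0.32, 'B': -0.24, 'C': -0.04} -> all negative
```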
Summary
1. There was a sizeable increase in the p value from
pretest to posttest.
2. The D index for the key was positive.
3. The distractors all discriminated negatively.

If a criterion-referenced test item shows these three
features, it has passed our “test” and is probably a good
item with little or no need for modification.

In contrast, if a test item fails these checks, rather than
modifying it, it is probably more efficient to replace it with
another.
Comparing the Percentage Answering Each
Item Correctly on Both Pre- and Posttest
This comparison tells you whether your test is sensitive to
your objectives.

What to do? For each item, compute:

Percentage passing posttest − Percentage passing pretest

The more positive the difference, the more the item taps the
content you are teaching.
Analyzing Sample Data
Consider the following percentages for five test
items:
Item  % passing pretest  % passing posttest  Difference
1     16                 79                  +63%
2     10                 82                  +72%
3*    75                 75                  0%
4     27                 91                  +64%
5*    67                 53                  -14%
Item 3 may be eliminated. (Students already know the content it
represents.)
Item 5 may be revised. (Instruction was not related to the item or
confused some students.)
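A short script reproduces this comparison. The data are the five items above; the 70% "already mastered" threshold used to separate "eliminate" from "revise" is my own illustrative assumption, not a rule from the text.

```python
# (item, % passing pretest, % passing posttest) for the five items above
items = [(1, 16, 79), (2, 10, 82), (3, 75, 75), (4, 27, 91), (5, 67, 53)]

for item, pre, post in items:
    diff = post - pre
    if diff <= 0 and pre >= 70:  # 70% mastery cutoff is an assumption
        note = "consider eliminating (students already knew the content)"
    elif diff <= 0:
        note = "consider revising (instruction may not match the item)"
    else:
        note = "item taps the content taught"
    print(f"Item {item}: {diff:+d}%  {note}")
```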
Determining the Percentage of Items
Answered in the Expected Directions
for the Entire Test
Steps
1. Find the number of items each student failed on the pretest
but passed on the posttest.
Results for Mary
Item Pretest Posttest
1* Incorrect Correct
2 Correct Correct
3* Incorrect Correct
4 Correct Incorrect
5 Incorrect Incorrect
(The starred items are those Mary failed on the pretest but
passed on the posttest.) Do the same for the other students.
2. Add the counts and divide by the number of
students.

Example:
Mary 18, Carlos 15, Sharon 22, Amanda 20, Charles 13

(18 + 15 + 22 + 20 + 13) / 5 = 88 / 5 = 17.6
3. Divide by the number of test items.

17.6 / 25 = 0.70

4. Multiply by 100.

0.70 × 100 = 70%
The greater the overall positive percentage of
change, the more your test is likely to match your
instruction and to be a content-valid test.
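Putting the four steps together: the sketch below (illustrative names; the counts are the five students above, on a 25-item test) computes the overall percentage of items answered in the expected direction.

```python
def expected_direction_percentage(fail_then_pass_counts, num_items):
    """Steps 2-4: average the per-student counts, divide by test length, multiply by 100."""
    mean_count = sum(fail_then_pass_counts) / len(fail_then_pass_counts)  # step 2
    return mean_count / num_items * 100                                   # steps 3-4

# Step 1 (done by hand): items each student failed on the pretest but
# passed on the posttest -- Mary 18, Carlos 15, Sharon 22, Amanda 20, Charles 13.
counts = [18, 15, 22, 20, 13]
print(round(expected_direction_percentage(counts, num_items=25), 1))
# 70.4 -> about 70% (the text rounds 17.6/25 to 0.70 before multiplying)
```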
Limitations of the Modifications
• Difficult
• Unit of instruction is brief
• From norm-referenced test to criterion-referenced test
• Time devoted to instruction (pre- and post-)
DEBRIEFING
Discuss Problem Items
Listen to Students' Reactions
Avoid On-the-Spot Decisions
Be Equitable with Changes
Ask Students to Double-Check
Ask Students to Identify Problems
PROCESS OF EVALUATING
CLASSROOM ACHIEVEMENTS
THANK YOU…
Editor's Notes
• #2: Greetings…
• #4: Topics to be discussed in this chapter…
• #5: Assume that we already have:
  • Written measurable instructional objectives
  • Prepared a test blueprint, specifying the number of items for each content and process area
  • Written test items that match our instructional objectives