Traditional Pen
and Paper Tests
Prepared by: Group 3
WHAT IS A TEST?
• A set of written or
spoken questions
used for finding out
how much someone
knows about a topic
(Macmillan
Dictionary)
TYPES OF TESTS
Discussant: Loreto M. Isip Jr. and
Maria Hervie S. Autor
EDUCATIONAL TESTS
-Primary function is the
measurement of results or
effects of instruction.
Ex. Achievement Tests.
PSYCHOLOGICAL TESTS
-Measure the intangible aspects of behaviour,
such as attitudes, interests, emotional
adjustment, intelligence, and ability.
• Ex. Personality Tests
MASTERY TESTS
-Achievement tests which measure the
degree to which an individual has
mastered certain instructional
objectives or specific learning
outcomes.
SURVEY TESTS
-Measure a student’s general level of
achievement across a broad range of
learning outcomes.
INDIVIDUAL TESTS
-Administered on a one-to-one
basis using questioning
Ex. Individual Intelligence tests
GROUP TESTS
-Administered to
groups of individuals
POWER TESTS
• Items are arranged in increasing
order of difficulty
• Measure the individual’s ability to answer
increasingly difficult items within a given field.
SPEED TESTS
-Measure the speed and accuracy with
which the pupil is able to respond to
the items.
VERBAL TESTS
-Make use of words
-Mental tests consisting of items that measure
vocabulary, verbal reasoning, comprehension, etc.
Examples: verbal reasoning tests, aptitude tests, etc.
Who is the thief in the famous Indian
play “The Little Clay Cart”?
• A. Charudatta
• B. Vasantasena
• C. Mendria
• D. Sharvilaka
You got it right!
D. Sharvilaka
NONVERBAL TESTS
-Paper and pencil tests or oral tests
-May involve drawings or physical
objects
Example: Non-verbal reasoning test
NON-VERBAL REASONING TESTS
INFORMAL TESTS
-Constructed by class-room
teachers
Ex: quizzes, long tests, etc.
STANDARDIZED TEST
-Constructed by test experts,
administered and scored under
standard conditions
Ex:
NCEE, NAT
CRITERION-REFERENCED
TEST
-Compares an individual's performance to the
acceptable standard of performance
- Requires completely specified objectives.
Applications
- Diagnosis of individual skill deficiencies
- Evaluation and revision of instruction
NORM-REFERENCED TEST
- Compares an individual's performance
to the performance of others.
- Requires varying item difficulties.
Ex: College entrance exams
PLANNING
(Writing Objectives, Table of
Specifications)
Discussant: Michelle Rubiso
TABLE OF
SPECIFICATIONS
Definition
- A plan prepared by a classroom
teacher as a basis for test
construction.
- A two-way chart which describes the
topics to be covered by a test and
the number of items or points
associated with each topic.
Preparing Table of Specifications
Tables of specifications have
some commonalities. Among
them are course content,
behaviour, number of test items,
placement and percentage.
Selecting Appropriate Item Format
-Some item formats are less
appropriate than others for
measuring certain objectives.
Example #1:
“student will be able to organize
his ideas and write them in a
logical and coherent fashion,”
Example #2:
“to obtain evidence of the pupils’
factual recall of names, places,
dates, and events,”
Building Table of Specifications
- Preparing a list of instructional
objectives
- Outlining the course content
- Preparing a two-way chart.
Example: two-way chart for a unit on weather, with objectives (knows, understands, interprets) across the top and content areas down the side.

| CONTENT | Knows basic terms | Knows weather symbols | Knows specific facts | Understands the influence of each factor on weather formation | Interprets weather maps | Total number of items | Percent of items |
|---|---|---|---|---|---|---|---|
| Air pressure | 1 | 1 | 1 | 3 | 3 | 9 | 15 |
| Wind | 1 | 1 | 1 | 10 | 2 | 15 | 25 |
| Temperature | 1 | 1 | 1 | 4 | 2 | 9 | 15 |
| Humidity and precipitation | 1 | 1 | 1 | 7 | 5 | 15 | 25 |
| Clouds | 2 | 2 | 2 | 6 |  | 12 | 20 |
| Total number of items | 6 | 6 | 6 | 30 | 12 | 60 |  |
| Percent of items | 10 | 10 | 10 | 50 | 20 |  | 100 |
REFERENCES
• https://www.google.com.ph/search?q=blooms+taxonomy&source=lnms&tbm=isch&sa=X&ei=5VeuU6zqCZDwoATxz4LQCg&ved=0CAYQ_AUoAQ&biw=1440&bih=775#q=blooms+taxonomy+revision&tbm=isch&facrc=_&imgdii=_&imgrc=N0V5IEkfhQeDfM%253A%3BDmhJKtP8LsC7fM%3Bhttp%253A%252F%252Fclickerquestions.pbworks.com%252Fw%252Ff%252FBlooming%252520Peacock.png%3Bhttp%253A%252F%252Fclickerquestions.pbworks.com%252Fw%252Fpage%252F31115153%252FWriting-questions-based-on-Bloom's-taxonomy%3B742%3B497
• http://en.wikipedia.org/wiki/Bloom's_taxonomy
• http://teaching.uncc.edu/learning-resources/articles-books/best-practice/goals-objectives/writing-objectives
• http://www.specialconnections.ku.edu/?q=assessment/quality_test_construction/teacher_tools/table_of_specifications
CONSTRUCTING
(DECIDING ON THE TEST FORMAT
AND WRITING TEST)
Discussant: Ayra Mae Patricia Tapaya
A. CONSTRUCTING/IMPROVING
MAIN STEM
• The main stem of the test item
may be constructed in question
form, completion form or direction
form.
QUESTION FORM
Which is the same as four hundred
seventy?
a.
b.
c.
COMPLETION FORM
Four hundred seventy is the same
as_____.
a.
b.
c.
DIRECTION FORM
Add: 22
+ 43
a.
b.
c.
• The main stem should be clear.
• The question should not be trivial.
• Questions that tap only rote learning
and memory should be avoided.
• Questions should tap only one ability.
• Each question should have only one
answer, not several possible
answers.
B. CONSTRUCTING/IMPROVING
ALTERNATIVES
• Alternatives should be as closely related to each other as
possible.
• Alternatives should be arranged in natural order.
• Alternatives should be arranged according to length: from
shortest to longest or vice versa.
• Alternatives should have grammatical parallelism.
• Arrangement of correct answers should not follow any
pattern.
RULES FOR CONSTRUCTING ALTERNATIVE-
RESPONSE ITEMS
• Avoid specific determiners.
• Avoid a disproportionate number of either true or false
statements.
• Avoid the exact wording of the textbook.
• Avoid trick statements.
• Limit each statement to the exact point to be tested.
• Avoid double negatives.
• Avoid ambiguous statements
• Avoid unfamiliar, figurative, or literary language
• Avoid long statements, especially those involving complex
sentence structures.
• Avoid quantitative language wherever possible.
• Commands cannot be “true” or “false”.
• Require the simplest possible method of indicating the
response.
• Indicate with a short line or parentheses ( ) where the
response is to be recorded.
• Arrange the statements in groups.
RULES FOR CONSTRUCTING COMPLETION
ITEMS
• Avoid indefinite statements
• Avoid over-mutilated statements (too many blanks).
• Omit key words and phrases, rather than trivial details.
• Avoid lifting statements directly from the text.
• Make the blanks of uniform length.
• Avoid grammatical clues to the correct answer.
• Try to choose statements in which there is only one correct
response for the blanks.
• The required response should be a single word
or a brief phrase.
• Arrange the test so that the answers are in the
column at the right of the sentences.
• Avoid unordered series within an item.
• Prepare a scoring key that contains all
acceptable answers.
• Allow one point for each correctly filled blank.
SUGGESTIONS FOR CONSTRUCTING
MATCHING EXERCISES
• Be careful about what material is put into the question
column and what is put into the option column.
• Include only homogenous material in each matching
exercise.
• Check each exercise carefully for unwarranted clues that
may indicate matching parts.
• Be sure that the students fully understand the bases on
which matching is to be done.
• Put items on the left and number them, put options
on the right and designate them by letters.
• Arrange items and options in systematic order.
• Place all the items and options for a matching type
exercise on a single page, if possible.
• Limit a matching exercise to not more than 10-15
items.
SUPPLY TESTS
- require examinees to recall
and supply the answer
Ex. essay tests
USES OF ESSAY TESTS
• Assess the ability to recall, organize,
and integrate ideas.
• Assess the ability to express oneself in
writing.
• Assess student understanding of
subject matter.
ADVANTAGES OF USING ESSAY
QUESTIONS
• Allows the student to express himself in
his own words.
• Measures complex learning outcomes.
• Promotes the development of problem-
solving skills.
ADVANTAGES OF USING ESSAY QUESTIONS
• Easy and economical to administer.
• Encourages good study habits in students.
• Does not encourage guessing and cheating
during testing.
TYPES OF ESSAY QUESTIONS
1. Restricted-Response Essay Questions
• Limits both the content and response
• Useful for measuring learning outcomes
requiring interpretation and application of
data in a specific area.
Example
Describe two situations that demonstrate
the application of the law of supply and
demand. Do not use the examples
discussed in class.
ADVANTAGES OF RESTRICTED RESPONSE
QUESTIONS
• Restricted-response questions are
more structured.
• Measure specific learning
outcomes.
• Provide for more ease of
assessment.
LIMITATIONS OF RESTRICTED RESPONSE
QUESTIONS
• Restrict the scope of the topic to
be discussed and indicate the
nature of the desired response.
TYPES OF ESSAY QUESTIONS
2. Extended Response Essay
Questions
- Allow students to select the information that they
think is pertinent, to organize the answer
in accordance with their best judgment,
and to integrate and evaluate ideas as
they think suitable.
EXAMPLE OF EXTENDED RESPONSE ESSAY
QUESTIONS
Imagine that you and a friend found a
magic wand. Write a story about an
adventure that you and your friend
had with the magic wand.
ADVANTAGES OF EXTENDED
RESPONSE QUESTIONS
• Measure learning outcomes at the
higher cognitive levels
• Expose individual differences in
attitudes, values and creative ability
LIMITATIONS OF EXTENDED
RESPONSE QUESTIONS
•Insufficient for measuring
knowledge of factual materials
•Scoring is usually difficult and
unreliable
Restricted-Response Essay Questions measure the ability to:
 explain cause-effect relationships
 describe applications of principles
 present relevant arguments
 formulate tenable hypotheses
 formulate valid conclusions
 state necessary assumptions
 describe the limitations of data
 explain methods and procedures
Extended-Response Essay Questions measure the ability to:
 Produce, organize and express ideas
 Integrate learning in different areas
 Create original forms (e.g., designing an experiment)
 summarize (e.g., writing a summary of a story)
 construct creative stories
 explain concepts and principles
 persuade a reader
GENERAL AND SPECIFIC GUIDELINES IN
CONSTRUCTING TESTS
1. Restrict the use of essay questions
to those learning outcomes that
cannot be satisfactorily measured by
objective items.
2. Construct questions that will
call forth the skills specified in
the learning standards.
Example:
Write a two-page statement defending the
importance of conserving our natural
resources. (Your answer will be evaluated in
terms of its organization,
comprehensiveness, and relevance of the
arguments presented.)
3. Phrase the question so that the
student’s task is clearly indicated.
• Make it as specific as possible.
Example
Poor: Why do birds migrate?
Better: State three hypotheses that might
explain why birds migrate south in the fall.
Indicate the most probable one and give
reasons for your selection.
Example:
Poor: Compare the Democratic and
Republican parties.
Better: Compare the current
policies of the Democratic and
Republican parties with regard to
the role of government in private
business. Support your statements
with examples when possible. (Your
answer should be confined to two
pages. It will be evaluated in
terms of the appropriateness of the
facts and examples presented and the
skill with which it is organized.)
4. Indicate an approximate time limit
for each question.
• As each question is constructed,
the teacher should estimate the
approximate time needed for a
satisfactory response.
5. Avoid the use of optional questions
• With optional questions, students answer
different sets of questions, which makes their
scores harder to compare and can undermine
the validity of the test results.
SCORING ESSAY QUESTIONS
Tips to remember…
• Use clear specifications of
scoring criteria
• Inform students of scoring
criteria
• Use an initial review to find
“anchor” responses for
comparison
• Use descriptive rather than
judgmental scores or levels
(“writing is clear and thoughts
are complete” vs. “excellent”)
SCORING FOR RESTRICTED RESPONSE ESSAY
QUESTIONS
• In most instances, the
teacher should write an
example of an expected
response
• For example, if the student
is asked to describe three
factors that contributed to
the start of the Civil War,
the teacher would construct a
list of acceptable reasons
and give the student 1 point
for each of up to three
reasons given from the list
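A minimal sketch of that point-per-acceptable-reason idea, in Python (hypothetical; the answer key, matching rule, and function name are invented for illustration, and real essay answers require human judgment rather than exact string matching):

```python
# Hypothetical sketch: scoring a restricted-response item against a key of
# acceptable reasons, 1 point each, capped at 3 points.

ACCEPTABLE = {"slavery", "states' rights", "economic differences",
              "election of lincoln", "sectionalism"}   # illustrative key only

def score_restricted_response(reasons_given, max_points=3):
    """Count how many distinct acceptable reasons the student listed."""
    matched = {r.strip().lower() for r in reasons_given} & ACCEPTABLE
    return min(len(matched), max_points)

# Example: a student lists three reasons, two of which are on the key.
print(score_restricted_response(["Slavery", "the weather", "Sectionalism"]))  # 2
```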
SCORING FOR EXTENDED-RESPONSE ESSAY
QUESTIONS
Analytic Scoring Rubrics
• Consist of a rubric broken
down into key dimensions
that will be evaluated
• Enables teacher to focus on
one characteristic of a
response at a time
• Provides maximum feedback
for students
Holistic Scoring Rubrics
• Yield a single overall score
taking into account the
entire response
• Can be used to grade essays
more quickly
• Does not provide as much
specific feedback as an analytic
rubric
• Should not consist of scores
alone, but rather contain
scores accompanied by
statements of the
characteristics of the
response
• Examples: Tables 10.3 and 10.4
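To make the analytic/holistic contrast concrete, here is a small hypothetical Python sketch (the dimensions, levels, and descriptors are invented and are not taken from Tables 10.3 and 10.4):

```python
# Hypothetical sketch: analytic vs. holistic scoring of one essay response.

# Analytic: each key dimension is rated separately (here 0-4), so the student
# receives a per-dimension breakdown as feedback.
def analytic_score(ratings):
    return sum(ratings.values()), ratings

# Holistic: one overall level for the whole response, reported together with a
# descriptive statement rather than a bare number.
HOLISTIC_DESCRIPTORS = {
    4: "writing is clear, complete, and well supported",
    3: "mostly clear, with minor gaps in support",
    2: "partially developed, weak support",
    1: "minimal or off-topic response",
}

def holistic_score(level):
    return level, HOLISTIC_DESCRIPTORS[level]

ratings = {"organization": 3, "use of evidence": 2, "mechanics": 4}
print(analytic_score(ratings))   # (9, per-dimension feedback)
print(holistic_score(3))         # (3, "mostly clear, with minor gaps in support")
```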
SUGGESTIONS FOR SCORING ESSAY
QUESTIONS
• Prepare an outline of the expected
answer in advance and use a clear
scoring rubric
• Use the scoring rubric that is most
appropriate
• Decide how to handle factors that
are irrelevant to the learning
outcomes being measured
• Evaluate all responses to one
question before going on to
the next one
• When possible, evaluate
answers without looking at
the student’s name
• If especially important
decisions are to be based on
the results, obtain two or
more independent ratings
• Look out for bluffing! Page
247
ASSESSMENTS & RUBRICS |
CRESST - CRESST OFFICIAL SITE
• http://www.cse.ucla.edu/products/teachers/highschool_scoringmanual.pdf
• http://www.cse.ucla.edu/products/assessments.php#
EVALUATION
DISCUSSANT:
MAIRODEN MISLANG GUEVARRA
First Tryout, Second Tryout, Third Tryout
A. First Tryout
Item Analysis: the
process of examining the pupils’
responses to each test item.
Specifically, what one looks for is the difficulty and
discriminating ability of the item as well as the
effectiveness of each alternative.
U-L Index Method (Stocklein, 1957)
Steps in using this method:
1. Score the papers and rank them from highest to lowest
according to the total score.
2. Separate the top 27% and the bottom 27% of the papers.
3. Tally the responses made to each test item by each
individual in the upper 27% group.
4. Tally the responses to each test item by each individual in the
lower 27% group.
U-L INDEX METHOD (STOCKLEIN, 1957)
5. Compute the percentage of the upper group that got the
item right and call it “U”.
6. Compute the percentage of the lower group that got the
item right and call it “L”.
7. Average the U and L percentages; the result is the difficulty
index of the item.
8. Subtract the L percentage from the U percentage; the result
is the discrimination index.
.00 - .20 Very Difficult
.21 - .80 Moderately Difficult
.81 – 1.00 Very Easy
DIFFICULTY INDEX AND DISCRIMINATION INDEX
Difficulty index- the percentage of
the pupils who got the item right. It can also be
interpreted as how easy or how difficult an item
is.
Discrimination index- separates the bright
pupils from the poor ones. Thus, a good test
item separates the bright from the poor pupils.
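A minimal Python sketch of the U-L procedure and the difficulty classification above (the paper scores and item responses are invented; only the steps and cut-offs follow the slides):

```python
# Hypothetical sketch of the U-L index method (Stocklein, 1957) for one item.
# `papers` pairs each examinee's total score with whether they answered the
# item under analysis correctly; the data are invented.

papers = [
    (48, True), (45, True), (44, True), (41, False), (40, True),
    (38, True), (36, False), (35, True), (33, False), (30, True),
    (28, False), (26, True), (24, False), (22, False), (20, False),
    (18, False), (15, True), (12, False), (10, False), (8, False),
]

# Steps 1-2: rank papers by total score, take the top and bottom 27%.
papers.sort(key=lambda p: p[0], reverse=True)
n = max(1, round(0.27 * len(papers)))      # 27% of 20 papers -> 5
upper, lower = papers[:n], papers[-n:]

# Steps 3-6: percentage of each group answering the item correctly.
U = sum(correct for _, correct in upper) / n
L = sum(correct for _, correct in lower) / n

# Steps 7-8: difficulty = average of U and L; discrimination = U - L.
difficulty = (U + L) / 2
discrimination = U - L

def difficulty_label(p):
    if p <= 0.20:
        return "very difficult"
    if p <= 0.80:
        return "moderately difficult"
    return "very easy"

print(f"U={U:.2f}  L={L:.2f}  difficulty={difficulty:.2f} "
      f"({difficulty_label(difficulty)})  discrimination={discrimination:.2f}")
```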
B. Second Tryout
After analyzing the results of the first tryout, test
items are usually revised for improvement. After
revising those items which need revision, another
tryout is necessary.
The revised form of the test is
administered to a new set of samples. The same
conditions as in the first tryout are followed.
C. THIRD OR FINAL TRYOUT
After two revisions, the test is
considered ready to be in its final form.
The test now shows acceptable difficulty
and discrimination indices.
The test is ready to be tested for
reliability and validity.
ESTABLISHING THE TEST
VALIDITY
• Validity can be best defined as the
degree to which a test is capable of
achieving certain aims. It is
sometimes defined as truthfulness.
KINDS OF VALIDITY
Content Validity: related to how adequately the
content of the test samples the domain about
which inferences are to be made.

Criterion-Related Validity: pertains to the empirical
technique of studying the relationship between the
test and some independent external measures
(criteria).

Construct Validity: the degree to which the test
scores can be accounted for by certain explanatory
constructs in a psychological theory.
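For criterion-related validity, the "empirical technique" usually amounts to correlating test scores with the external criterion. A minimal sketch (hypothetical Python with invented scores; Pearson r via the standard library, Python 3.10+):

```python
# Hypothetical sketch: criterion-related validity as the Pearson correlation
# between test scores and an external criterion (e.g., later course grades).
from statistics import correlation  # available in Python 3.10+

test_scores = [45, 38, 52, 30, 41, 48, 35, 44]   # invented test scores
criterion   = [88, 75, 92, 68, 80, 90, 72, 85]   # invented criterion measures

r = correlation(test_scores, criterion)
print(f"validity coefficient r = {r:.2f}")       # closer to 1.0 = stronger evidence
```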