Testing Is Only Part Of The Evaluation Of Learning
Every time you ask a question in class, monitor a student discussion, or read a
term paper, you are evaluating learning. Moreover, the evaluation process
(whether it involves examinations or not) is a valuable part of the teaching
process. The primary purpose of evaluation is to provide corrective feedback to the student; the secondary purpose is to satisfy the administrative requirement of
ranking students on a grading scale. Owing to limitations of space, we cannot
provide an exhaustive explanation of the types of tests and rules for writing them,
but we will offer a few guidelines for each type and focus primarily on the two
most widely used types of exams: multiple-choice and essay.
The selection of material to be tested should be based on learning objectives for
the course, but the complexity of the course material associated with those
objectives (and the limited time for taking exams) means that you can only
sample the material in any given unit or course. All tests should have complete,
clearly written instructions, time limits for each section, and point values assigned
to different questions or groups of questions. The question sheets should be
clearly typed and duplicated so that students have no difficulty reading them.
When grading exams, strive for fairness and impartiality by keeping the identity of
each student secret from yourself until you have finished the entire set of tests.
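One low-tech way to keep identities hidden while grading is to have students write only a random code on their papers and to keep the name-to-code key sealed until scoring is finished. A minimal sketch of such a workflow, assuming a simple one-column roster file (the file names and layout are illustrative, not part of the original guidelines):

```python
import csv
import secrets

def build_anonymous_key(roster_path: str, key_path: str) -> None:
    """Assign each student a random grading code and save the name-to-code
    mapping so identities are re-attached only after all tests are scored."""
    with open(roster_path, newline="") as f:
        names = [row[0] for row in csv.reader(f) if row]

    mapping = {name: secrets.token_hex(3) for name in names}  # e.g. 'a3f91c'

    with open(key_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student", "grading_code"])
        for name, code in mapping.items():
            writer.writerow([name, code])

# Students write only their code on the exam; the key file stays closed
# until every paper in the set has been graded.
```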
Some additional issues arise in testing in mathematics and the natural sciences,
since students are required to work problems on their exams. Answers may be essentially correct yet differ in accuracy and completeness, so the type of answer and the degree of precision you expect must be clearly specified. You must also decide how much work the student will be required to show and how partial credit will be allocated for incomplete answers (a simple weighting scheme is sketched after this paragraph). Keep in mind that the basic
purpose of a test is to measure student performance, and the best teachers
constantly work to refine their testing techniques and procedures. Poor
techniques may result in tests that only measure the ability to take a test - test-
wise students will perform well whether or not they know the material. Writing
good exam questions requires plenty of time for composition, review, and
revision. Also, it is beneficial to ask a colleague to review the questions before you
give the exam - another teacher might identify potential problems of
interpretation or spot confusing language.
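Returning to partial credit: one way to keep its allocation consistent across problems and graders is to score each problem against an explicit step rubric decided before grading begins. A minimal sketch under that assumption; the step names and weights below are hypothetical examples, not prescribed values:

```python
from typing import Dict

def partial_credit(steps_earned: Dict[str, bool], weights: Dict[str, float],
                   max_points: float) -> float:
    """Award a share of max_points for each rubric step the student completed.
    Weights are expected to sum to 1.0; steps missing from `weights` are ignored."""
    earned_weight = sum(w for step, w in weights.items()
                        if steps_earned.get(step, False))
    return round(max_points * earned_weight, 2)

# Hypothetical rubric for a single problem worth 10 points.
weights = {"sets_up_equation": 0.4, "correct_algebra": 0.3,
           "numerical_answer": 0.2, "correct_units": 0.1}
student = {"sets_up_equation": True, "correct_algebra": True,
           "numerical_answer": False, "correct_units": True}

print(partial_credit(student, weights, max_points=10))  # -> 8.0
```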
The major weakness of multiple-choice tests is that teachers may develop
questions that require only recognition or recall of information. Multiple-choice
questions in teachers' manuals that accompany textbooks often test only
recognition and recall. Strive for questions that require application of knowledge
rather than recall. For example, interpretation of data presented in charts, graphs,
maps, or other formats can form the basis for higher-level multiple-choice
questions. Multiple-choice questions normally have four or five options, to make
it difficult for students to guess the correct answer. Only one option should be
unequivocally correct; "distractors" should be unequivocally wrong. After a test
has been given, it is important to perform a test-item analysis to improve its
validity and reliability.
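The article does not spell out what a test-item analysis involves; a common approach is to compute each item's difficulty (the proportion of students answering it correctly) and a discrimination index (how well the item separates high scorers from low scorers). A minimal sketch under that assumption, using the conventional upper-versus-lower-group comparison:

```python
from typing import List

def item_analysis(responses: List[List[int]]) -> List[dict]:
    """responses[s][i] is 1 if student s answered item i correctly, else 0.
    Returns per-item difficulty and an upper-minus-lower discrimination index."""
    totals = [sum(student) for student in responses]
    ranked = sorted(range(len(responses)), key=lambda s: totals[s])
    n_group = max(1, len(responses) * 27 // 100)   # conventional 27% tails
    lower, upper = ranked[:n_group], ranked[-n_group:]

    stats = []
    for i in range(len(responses[0])):
        correct = sum(responses[s][i] for s in range(len(responses)))
        difficulty = correct / len(responses)
        disc = (sum(responses[s][i] for s in upper)
                - sum(responses[s][i] for s in lower)) / n_group
        stats.append({"item": i, "difficulty": round(difficulty, 2),
                      "discrimination": round(disc, 2)})
    return stats

# Items that nearly everyone gets right or wrong, or whose discrimination is
# near zero (or negative), are candidates for revision before the next exam.
```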
In matching items, the student is presented with two related lists of words or
phrases and must match those in one column with those in a longer column of
alternative responses. Obviously, one should use only homogeneous words and
phrases in a given set of items to reduce the possibility of guessing the correct
answers through elimination. For example, a list that includes names, dates, and terms is easier to match than one containing only names. Arrange the
lists in alphabetical, chronological, or some other order. Keep the lists short (ten
to twelve items) and type them on the same page of the exam.
Completion questions, short-answer questions, and essays form a continuum of
questions that require students to supply the correct answers. Completion
questions are an alternative to selection items for testing recall, but they cannot
test higher-order learning. In writing completion items, give the student sufficient
information to answer the question but not enough to give the answer away.
Questions that require students to generate their own response need clear,
unambiguous directions for the expected answer.
Students cannot answer an essay question by simply recognizing the correct
answer, nor can they study for an essay exam by memorizing factual material.
Essay questions can test complex thought processes, critical thinking, and
problem-solving skills, and essays require students to use the English language to
communicate in sentences and paragraphs - a skill that undergraduates need to
exercise more frequently. But essay questions which require no more than a
regurgitation of facts do not measure higher-order learning.
Although these guidelines are written from the perspective of the social sciences
and humanities, most of these rules also apply to devising long problems in
science courses. Since one of the advantages of essay questions is their ability to
test elements of higher-order learning, your first task is to define the type of
learning you expect to measure. If you wish to test problem-solving skills, the
format and method for solving the problems must be clearly communicated to
students. Presenting problems with no clues about how to proceed may cause
students to adopt a plausible but incorrect approach, even if they know how to solve the problem correctly.
It is helpful to distinguish between essay questions that require objectively
verifiable answers and those that ask students to express their opinions,
attitudes, or creativity. The latter are more difficult to construct and evaluate
because it is more difficult to specify grading criteria (they therefore tend to be
less valid measures of performance).
The reliability of essay questions can be increased by paying close attention to the
criteria for answers. Many teachers don't realize that it is necessary not only to compose a model answer but also to provide students with instructions that will elicit
the desired answer. First, write an outline of your best approximation of the
correct answer, with all of its sections in place. When you have read over your
answer several times and are satisfied that it will measure the appropriate course
objective, write the instructions students will need to answer the question with
the scope and direction you intend. Describe the expected length of the answer,
its form and structure, and any special elements that should be present. Good
grading practices also increase the reliability of essay tests. Research has shown
that the scoring of essays is usually unreliable; scores not only vary across different graders but also vary for the same grader at different times. If the
grader knows the identity of the student, his/her overall impressions of that
student's work will inevitably influence the scoring of the test.
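One way to check whether your model answer and instructions are doing their job is to have two graders score the same sample of essays independently and compare the results. A minimal sketch using the Pearson correlation between the two score sets (the scores below are illustrative, and statistics.correlation requires Python 3.10+):

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical scores from two graders on the same seven essays.
grader_a = [18, 15, 12, 20, 9, 16, 14]
grader_b = [17, 13, 14, 19, 10, 15, 11]

r = correlation(grader_a, grader_b)
print(f"inter-rater correlation: {r:.2f}")
# A low correlation suggests the model answer or the scoring instructions
# leave too much room for individual judgment.
```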
If, through some quirk in wording, students misinterpret your intent, or if your
standards are unrealistically high or low, you can alter the key in light of this
information. If these problems are not in evidence, and you have carefully
constructed the model answer, students should not be able to surprise you with
better answers than yours. However, you should be open to legitimate
interpretations of the questions different from your own. Finally, unless you
intend to grade grammar, syntax, spelling, and punctuation as part of the
examination, try to overlook flaws in composition and focus instead on the
accuracy and completeness of the answers.
It is important to write comments on test papers as you grade them, but
comments do not have to be extensive to be effective. For example, brief notes can mark where you assessed penalties for incorrect statements, omission of relevant material, inclusion of irrelevant material, and errors in logic that lead to unsound conclusions.
Distributing your model answers with the corrected essays can alleviate some of
the burden of writing comments on exams.
Jeff C. Palmer is a teacher, success coach, trainer, Certified Master of Web
Copywriting and founder of https://Ebookschoice.com. Jeff is a prolific writer,
Senior Research Associate and Infopreneur having written many eBooks, articles
and special reports.
Source: https://ebookschoice.com/testing-is-only-part-of-the-evaluation-of-learning/
