Developing Affective Constructs



               Dr. Carlo Magno
 Counseling and Educational Psychology Department
           De La Salle University-Manila



                                                    1
Affective characteristics
• Anderson (1981) explained affective characteristics as
  “qualities which present people’s typical ways of
  feeling, or expressing their emotions” (p. 3).
• Sta. Maria and Magno (2007) found that affective
  characteristics run on two dimensions: intensity and
  direction.
• Intensity refers to the strength with which the
  characteristic is expressed.
• Direction refers to the source of the affect, ranging from
  external, object-related factors to person-related factors.
                                                           2
Dimensions of Affect
[Diagram: the two dimensions of affect, intensity and direction]
                       3
Classifications of Affective Scales
• Attitude. Learned predispositions to respond in a
   consistently favorable or unfavorable manner with
   respect to a given object (Meece et al. 1982).
• Sample items from the “Attitude Toward the Church”
   Scale by Thurstone and Chave (1929):
1. I think the teaching of the church is altogether too
   superficial to have much social significance.
2. I feel the church services give me inspiration and help
   me to live up to my best during the following week.

                                                             4
Classifications of Affective Scales
• Beliefs. Judgments and evaluations that we make about
  ourselves, about others, and about the world around us
  (Dilts, 1999).
• Examples of belief statements:
• “A quiet classroom is conducive to learning”
• “Studying longer will improve a student’s score on the
  test”
• “Grades encourage students to work harder”
• Interests. "a liking/disliking state of mind
  accompanying the doing of an activity" (Strong, 1955,
  p. 138).
                                                       5
Classifications of Affective Scales
• Values. “the principles and fundamental convictions
  which act as general guides to behavior, the standards
  by which particular actions are judged to be good or
  desirable” (Halstead & Taylor, 2000, p. 169).
• Dispositions are guided by beliefs and attitudes related
  to values such as caring, fairness, honesty,
  responsibility, and social justice. Examples of
  dispositions include fairness, being democratic,
  empathy, enthusiasm, thoughtfulness, and
  respectfulness.

                                                             6
Steps in Constructing Non-Cognitive Measures
• Decide what information should be sought
   – (1) No scales are available to measure such construct
   – (2) All existing scales are foreign and are not suitable for the
     stakeholders or sample who will take the measure
   – (3) Existing measures are not appropriate for the purpose of
     assessment
   – (4) The test developer intends to explore the underlying
     factors of a construct and eventually confirm it
• Search for Content Domain:
   – Search for relevant literature reviews
   – Look for the appropriate definition
   – Explain the theory
   – Specify the underlying variables (deconstruction)
                                                                    7
Steps in Constructing Non-Cognitive Measures
[Diagram: a single factor deconstructed into Subscale 1 through Subscale 5]
                                      8
Steps in Constructing Non-Cognitive Measures

[Diagram: Factor 1 indicated by items 1–3 and Factor 2 indicated by items 4–5; a blueprint sketch follows below]
                                      9
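Before any items are drafted, the deconstruction shown in the two diagrams above can be written out as a simple blueprint that maps the factor to its subscales and each subscale to candidate item stems. A minimal sketch in Python; the construct name, subscale names, and item stems below are hypothetical placeholders, not content from the slides:

```python
# Hypothetical blueprint: one factor deconstructed into subscales,
# each subscale mapped to draft item stems (all names are placeholders).
blueprint = {
    "Academic Motivation": {                 # the factor / construct
        "Intrinsic Interest": [
            "I enjoy working on difficult school tasks.",
            "I study even when no exam is coming up.",
        ],
        "Persistence": [
            "I keep working on a task until it is finished.",
            "I do not give up easily when a lesson is hard.",
        ],
    }
}

# Quick check that every subscale has at least one draft item stem.
for factor, subscales in blueprint.items():
    for subscale, stems in subscales.items():
        assert stems, f"{factor} / {subscale} has no draft items yet"
```

Keeping the blueprint in this form also makes it easy to count the planned items per subscale when the Table of Specifications is prepared in the next step.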
Steps in Constructing Non-Cognitive Measures
• Write the first draft of items:
• Items are created for each subscale as guided by the conceptual
  definition.
• The number of items as planned in the Table of Specifications
  is also considered.
• As much as possible, a large number of items are written to
  represent well the behavior being measured.
• How to write Items:
   –   Items are based on the definition of the subscales
   –   Provide the manifestation of the construct
   –   Descriptions from references
   –   Conduct open-ended surveys, interviews, or focus group discussions (FGDs)


                                                                10
Steps in Constructing Non-Cognitive Measures
Good questionnaire items should:
1. Include a vocabulary that is simple, direct, and familiar to all
   respondents
2. Be clear and specific
3. Not involve leading, loaded, or double-barreled questions (a rough
   automated screen for these is sketched after this slide)
4. Be as short as possible
5. Include all conditional information prior to the key ideas
6. Be edited for readability
7. Be generalizable across a large sample.
8. Avoid time-bound situations.


                                                                      11
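Some of these wording rules can be screened for automatically before the items go to expert review. A minimal sketch under the assumption that the draft item stems are available as plain strings; these are rough heuristics only and are not a substitute for judgmental review:

```python
import re

def flag_item(stem: str, max_words: int = 20) -> list[str]:
    """Return possible wording problems for one item stem (rough heuristics only)."""
    flags = []
    text = stem.lower()
    if len(stem.split()) > max_words:                    # rule 4: keep items short
        flags.append("item may be too long")
    if " and " in text:                                  # rule 3: possible double-barreled item
        flags.append("possible double-barreled item ('and')")
    if len(re.findall(r"\bnot\b|n't", text)) >= 2:       # rule 3: possible double negative
        flags.append("possible double negative")
    if text.startswith(("most people", "everyone")):     # rule 3: possible leading question
        flags.append("possible leading question")
    return flags

# The double-barreled example on the next slide is flagged:
print(flag_item("I am satisfied with my wages and hours at the place where I work."))
```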
Steps in Constructing Non-Cognitive Measures
• Example of bad items:
• I am satisfied with my wages and hours at the place where I
  work. (Double Barreled)
• I am not in favor of Congress passing a law not allowing any
  employer to force any employee to retire at any age. (Double
  Negative)
• Most people favor the death penalty. What do you think? (Leading
  Question)




                                                                12
Steps in Constructing Non-Cognitive Measures
• Select a scaling technique:
• After writing the items, the test developer decides on the appropriate
  response format to be used in the scale.
• The most common response formats used:
   – Likert scale (measure of position on an opinion; a scoring sketch follows this slide)
   – Verbal frequency scale (measure of a habit)
   – Ordinal scale (ordering of responses)
   – Linear numeric scale (judging a single dimension in an array)
• Develop directions for responding:
• Directions or instructions for the target respondents should be created as
  early as the items themselves.
• Keep them clear and concise.
• Respondents should be informed how to answer.
• When you intend to have a separate answer sheet, make sure to inform the
  respondents about it in the instructions.
• Instructions should also explain how to mark answers (encircle, check, or
  shade) and how to change an answer.
• Inform the respondents in the instructions specifically what they need to do.
                                                                              13
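Whatever response format is chosen, the scoring key should be fixed at this stage. A minimal sketch of scoring a 5-point Likert agreement scale with one reverse-keyed item; the item identifiers, anchors, and keying are hypothetical:

```python
# Map the verbal Likert anchors to numeric scores (5-point agreement scale).
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Undecided": 3,
          "Agree": 4, "Strongly Agree": 5}

REVERSE_KEYED = {"item_03"}   # hypothetical: negatively worded items

def score_response(item_id: str, answer: str) -> int:
    raw = LIKERT[answer]
    # Reverse-keyed items are flipped so high scores always mean more of the trait.
    return 6 - raw if item_id in REVERSE_KEYED else raw

print(score_response("item_01", "Agree"))   # 4
print(score_response("item_03", "Agree"))   # 2 (reversed)
```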
Steps in Constructing Non-Cognitive Measures
• Conduct a judgmental review of items
• Have experts review your items.




                                         14
Steps in Constructing Non-Cognitive Measures
•   Reexamine and revise the questionnaire
•   Prepare a draft and gather preliminary pilot data:
•   Requires a layout of the test for the respondents.
•   Make the scale as easy as possible to use.
•   Each item can be identified with a number or a letter to
    facilitate scoring of responses later.
•   The items should be structured for readability and recording
    responses.
•   Whenever possible items with the same response formats are
    placed together.
•   In designing self-administered scales, make the scale visually
    appealing to increase the response rate.
•   The scale should be self-explanatory, and respondents should be
    able to complete it in a short time.
•   Ordering of items: the first few questions set the tone for the
    rest of the items and determine how willingly and conscientiously
    respondents will work on subsequent questions.
                                                                       15
Steps in Constructing Non-Cognitive Measures
• Analyze Pilot data:
• The responses in the scale should be recorded using a
  spreadsheet.
• The numerical responses are then analyzed.
• The analysis consists of determining whether the test is reliable
  and valid (a reliability sketch follows this slide).
• Revise the Instrument:
• The instrument is then revised: items with low factor loadings are
  removed, as are items whose removal would increase Cronbach’s
  alpha.



                                                                  16
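Reliability of the pilot responses is commonly summarized with Cronbach's alpha, which can be computed directly from the spreadsheet. A minimal sketch assuming the data have been read into a pandas DataFrame with one row per respondent and one column per item; the file name is hypothetical:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)"""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

pilot = pd.read_csv("pilot_responses.csv")      # hypothetical file: respondents x items
print(f"alpha = {cronbach_alpha(pilot):.2f}")   # values around .70 or higher are usually read as acceptable
```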
Steps in Constructing Non-Cognitive Measures
• Gather final pilot data
• A large sample is again selected which is three times the
  number of items.
• Conduct Additional Validity and Reliability Analysis
• The validity and reliability are again analyzed using the new pilot
  data (a factor-analysis sketch follows this slide).
• Edit the questionnaire and specify the procedures for its use
• Items with low factor loadings are again removed, resulting in
  fewer items.
• A new form of the test with reduced items will be formed.
• Prepare the Test Manual
• The test manual indicates the purpose of the test, instructions for
  administering it, the procedure for scoring, and how to interpret
  the scores, including the norms.
                                                                    17
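The factor loadings used to decide which items to retain can be obtained from an exploratory factor analysis of the final pilot data. A minimal sketch using the third-party factor_analyzer package; the package choice, file name, two-factor solution, and the .30 loading cutoff are illustrative assumptions, not prescriptions from the slides:

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer   # pip install factor_analyzer

data = pd.read_csv("final_pilot.csv")         # hypothetical file: respondents x items
n_items = data.shape[1]
print(f"suggested minimum sample: {3 * n_items} respondents")   # three times the number of items

fa = FactorAnalyzer(n_factors=2, rotation="varimax")   # assumed two-factor solution
fa.fit(data)

loadings = pd.DataFrame(fa.loadings_, index=data.columns)
# Retain items whose highest absolute loading reaches the (assumed) .30 cutoff.
keep = loadings.abs().max(axis=1) >= 0.30
print("items to retain:", list(loadings.index[keep]))
```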
Examples of Response Formats
• Multiple response item:




• Single-response item




                                       18
Examples of Response Formats

• Likert Scale




                                        19
Examples of Response Formats

• Verbal frequency scale




                                       20
Examples of Response Formats

• Ordinal scale




                                       21
Examples of Response Formats
• Forced Ranking Scale




• Paired Comparison Scale




                                       22
Examples of Response Formats

• Comparative Scale




                                       23
Examples of Response Formats

• Linear Numeric Scale




                                       24
Examples of Response Formats

• Semantic differential scale




                                        25
Examples of Response Formats

• Adjective checklist




• Semantic distance scale




                                        26
Examples of Response Formats

• Fixed sum scale




                                      27
Examples of Response Formats

• Multiple rating list




                                        28
Examples of Response Formats

• Multiple rating matrix




                                        29
Examples of Response Formats

• Diagram scale




                                      30
Examples of Response Formats
• Graphic scale




                                         31
