Scale Dimensionality Dr. Carlo Magno De La Salle University
 
Measuring Achievement
Unidimensional Scaling Aimed at selecting a set of data items that can be empirically demonstrated to correspond to a single social-psychological dimension (Gordon, 1977). Methods: Thurstone or Equal-Appearing Interval Scaling, Likert or "Summative" Scaling, Guttman or "Cumulative" Scaling.
Multidimensional Scaling There is more than a single dimension underlying a set of observations. Aside from measuring the degree of the object, it classifies the object according to two or more properties. The trait being measured as a whole is called a latent construct/variable. The components of the latent construct are called factors, subscales, or manifest variables.
Multidimensional Scaling Methods of analyzing multidimensionality: Factor Analysis Exploratory Factor Analysis Principal Component Analysis Joining Tree Clustering K-Means clustering Confirmatory Factor Analysis Structural Equations Modeling (SEM)
Questions? Give examples of unidimensional and multidimensional constructs. How do you know if a construct is unidimensional or multidimensional?
Sandra Bem (1980) – BSRI: Gender Role Identity (Masculinity, Femininity). Aggression Scale: Aggression (Physical, Verbal).
Ambivalent Sexism Inventory by Glick and Fiske (1996): Sexism (Benevolent sexism, Hostile sexism). Two-Factor Theory of Intelligence by Charles Spearman: Intelligence (G factor, S factor).
Theory of Intelligence by Raymond Cattell: Intelligence (Crystallized Intelligence, Fluid Intelligence). Myers-Briggs Type Indicator: Temperament (Extraversion/Introversion, Sensing/Intuition, Thinking/Feeling, Judging/Perceiving).
Factor Analysis Used (1) to reduce the number of variables and (2) to detect structure in the relationships between variables, that is, to classify variables. A factor is a set of highly intercorrelated variables. Key concepts: Principal Component Analysis, Factor Loading, Eigenvalue, Factor Rotation, Principal Factor Analysis, Communalities.
Principal Component Analysis Combines variables that are highly correlated with each other. Factor loadings – the correlations between the factors and the variables. A factor loading of .30 indicates that the variable contributes meaningfully to the factor; a loading of .40 indicates a high contribution to the factor. If the factors and the variables are strongly correlated, they are measuring the same thing.
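The .30/.40 rules of thumb above can be sketched as a small helper (the thresholds are the slide's; the function name is our own, hypothetical):

```python
def loading_strength(loading):
    """Interpret an absolute factor loading using the rule of thumb
    above: >= .40 is a high contribution, >= .30 is a meaningful
    contribution, anything smaller is treated as negligible."""
    a = abs(loading)
    if a >= 0.40:
        return "high"
    if a >= 0.30:
        return "meaningful"
    return "negligible"
```

A loading of -.32 still counts as meaningful, since the sign only indicates the direction of the relationship between variable and factor.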
Eigenvalues A measure of how much variance each successive factor extracts. The first factor is generally more highly correlated with the variables than the second factor. This is to be expected because the factors are extracted successively, each accounting for less and less of the overall variance. Factor extraction stops when factors begin to yield low eigenvalues.
Eigenvalues Methods of evaluating eigenvalues: The Kaiser criterion. Retain only factors with eigenvalues greater than 1. Unless a factor extracts at least as much variance as the equivalent of one original variable, it is dropped (Kaiser, 1960). The scree test. A graphical method in which the eigenvalues are placed in a line plot. The cutoff is the place where the smooth decrease of eigenvalues appears to level off to the right of the plot. To the right of this point, presumably, one finds only "factorial scree" – "scree" is the geological term for the debris that collects on the lower part of a rocky slope.
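The Kaiser criterion lends itself to a one-line sketch (assuming the eigenvalues of the correlation matrix have already been computed):

```python
def kaiser_retain(eigenvalues):
    """Kaiser criterion (Kaiser, 1960): keep only the factors whose
    eigenvalue exceeds 1.0, i.e. factors that extract at least as much
    variance as one original standardized variable."""
    return [ev for ev in eigenvalues if ev > 1.0]
```

Applied to eigenvalues such as [2.5, 1.3, 0.9, 0.3], the first two factors would be retained.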
Factor Rotation Maximizes the variance (variability) of the "new" variable (factor) while minimizing the variance around it. Factor loadings are plotted in a scatterplot in which each variable is represented as a point. In this plot we could rotate the axes (clockwise or counterclockwise) in any direction without changing the relative locations of the points to each other; however, the actual coordinates of the points – the factor loadings – would of course change. The goal is to position the factor axes through, or at least nearer to, clusters of highly correlated variables.
Why are factors rotated? Factors are redefined in a second phase of factor analysis in order to achieve a simpler factor structure that is interpretable. Factor rotation enables the researcher to meet conditions such as: each variable loads strongly on one and only one factor; each factor shows two or more strong loadings; most loadings are high or low, with few intermediate values.
Methods of Factor Rotation Orthogonal Rotation  Varimax Rotation  Quartimax Equamax Oblique Rotation
Unrotated Factors
Orthogonal Rotation
Orthogonal Rotation The perpendicularity of the factor axes is maintained. The rotated factors explain the same total variance as the unrotated factors, and each variable's communality is unchanged; what changes is how the explained variance is distributed across the factors.
Types of Orthogonal Rotation Varimax rotation – Maximizes the variance of the squared factor loadings associated with each factor, i.e., maximizes the loadings within each column. Quartimax rotation – Minimizes the cross-products of the factor loadings, maximizing the loadings within each row (item). Equamax rotation – A compromise between varimax and quartimax, enhancing interpretability across both factors and items (variables).
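As an illustration of varimax, here is a minimal sketch of the standard SVD-based algorithm (not taken from the slides; assumes NumPy is available):

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation: iteratively find the orthogonal rotation R
    that maximizes the variance of the squared loadings within each
    column (factor), using the classic SVD-based update."""
    p, k = loadings.shape
    R = np.eye(k)          # start from the unrotated solution
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # Gradient of the varimax criterion, projected back via SVD
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (L @ np.diag((L**2).sum(axis=0))) / p)
        )
        R = u @ vt         # nearest orthogonal matrix to the gradient
        d_old, d = d, s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return loadings @ R
```

Because the rotation matrix R is orthogonal, each variable's communality (the row sum of squared loadings) is preserved; only the distribution of loadings across factors changes.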
Oblique Rotation
Oblique Rotation Provides greater flexibility in positioning factor axes by relaxing the requirement that the factors be orthogonal. Axes are positioned directly through clusters of highly correlated variables. Create factors that more strongly represent clusters of highly correlated variables than do orthogonal rotation.
What rotation should be used? Orthogonal: when the purpose is to create a reduced set of orthogonal factor variates that will be entered into other analyses, or when the loadings produced are interpretable (stick with the simpler solution). Oblique: when the FA is being used to uncover the factor structure underlying a set of variables, or when there are sound theoretical reasons to expect high correlations among factors.
Reading for Factor Rotation Diekhoff, G. (1992). Statistics for the social and behavioral sciences: Univariate, bivariate and multivariate. Dubuque, IA: Wm. C. Brown Publishers.
Principal Factor Analysis Also known as Principal Axis Factoring. The factors do not extract all of the variance in the items (presence of scree). Communality – the proportion of variance of a particular item that is due to common factors (shared with other items).
What is the difference between PCA and PFA? In PCA, each squared factor loading gives the proportion of variance in an original variable that is explained by a given factor variate; with p standardized variables the total variance to be explained is p × 1.0. Communalities are the sums of squared loadings for a given variable across factors – the proportion of variance in that variable explained by the set of extracted factor variates.
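A communality is simple arithmetic on one row of the loading matrix; a minimal sketch:

```python
def communality(loadings_row):
    """Communality of one variable: the sum of its squared loadings
    across the extracted factors, i.e. the proportion of that
    variable's variance explained by the factor variates."""
    return sum(l * l for l in loadings_row)
```

A variable loading .8 on the first factor and .3 on the second has communality .64 + .09 = .73, i.e. 73% of its variance is explained by the two factors.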
Exploratory Factor Analysis Absence of a prior hypothesis regarding the factor structure underlying a battery of attributes (Tucker & MacCallum, 1997). Used in the early stages of research on developing a concept. Hypotheses will be at best loosely defined; the general objective of the research is to explore the factorial structure of the domain.
Exploratory Factor Analysis Analysis: Involves estimation of all parameters of the model (common factor weights, intercorrelations, and unique variances). Use: Helps the researcher determine the number of factors, interpret the nature of those factors, and refine the battery of attributes for the purpose of further study of the domain.
Exploratory Factor Analysis Techniques Tree Analysis K-means cluster analysis Principal component analysis Multidimensional scaling
Procedure Identify the items that will be analyzed through EFA. Calculate the correlation matrix. Examine the matrix: inverse of the matrix, Bartlett test, anti-image covariance matrix, Kaiser-Meyer-Olkin criterion (KMO). Show the factor loadings.
Procedure Decide on the number of factors to be extracted using the scree test and elbow method. Interpret the factors extracted.
Statistical Output The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy tests whether the partial correlations among variables are small. Bartlett's test of sphericity tests whether the correlation matrix is an identity matrix, which would indicate that the factor model is inappropriate. Anti-image – the anti-image correlation matrix contains the negatives of the partial correlation coefficients, and the anti-image covariance matrix contains the negatives of the partial covariances. In a good factor model, most of the off-diagonal elements will be small. The measure of sampling adequacy for a variable is displayed on the diagonal of the anti-image correlation matrix.
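Bartlett's test statistic has a standard closed form: chi-square = -(n - 1 - (2p + 5)/6) · ln|R| with p(p - 1)/2 degrees of freedom, where R is the p × p correlation matrix. A self-contained sketch (the determinant helper is our own and only suitable for small matrices):

```python
import math

def _det(M):
    """Determinant by Laplace expansion (fine for small matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               _det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity: chi-square statistic and df for
    the hypothesis that the correlation matrix R is an identity matrix."""
    p = len(R)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * math.log(_det(R))
    df = p * (p - 1) // 2
    return chi2, df
```

For a two-variable correlation matrix with r = .5 and n = 100, the statistic is about 28.0 on 1 df – large enough to reject sphericity, so factoring is appropriate.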
Confirmatory Factor Analysis There is a developed and specific hypothesis about the factorial structure of a battery of attributes. The hypothesis concerns the number of common factors, their pattern of intercorrelation, and pattern of common factor weights. Used to indicate how well a set of data fits the hypothesized structure. The structure is hypothesized in advance.  Follow-up to a standard factor analysis
Confirmatory Factor Analysis Allows the investigator to fit the common factor model to observed data under various types of constraints. Analysis: The parameters of the model are estimated, and the goodness of fit of the solution to the data is evaluated. The degree to which the solution fits the data provides evidence for or against the prior hypothesis.
Confirmatory Factor Analysis A solution that fits well lends support for the hypothesis and provides evidence for the construct validity of the attributes and the hypothesized factorial structure of the domain as represented by the battery of attributes. CFA is integrated in Structural Equation Modeling (SEM), helping create the latent variables modeled by SEM. It is done to validate a scale or index by demonstrating that its constituent items load on the same factor, and to drop proposed scale items which cross-load on more than one factor.
Approaches in CFA Traditional Approach – Uses principal axis factoring (PAF/PFA). This method allows the researcher to examine the factor loadings of indicator variables to determine whether they load on the latent variables (factors) as predicted by the researcher's model. SEM Approach – Model building in which the covariance of every pair of factors is determined together with the corresponding errors. It tests whether the factors are significant components of the latent construct.
Goodness of Fit Noncentrality Interval Estimation Single Sample Goodness of fit Index
Noncentrality Interval Estimation Represents a change of emphasis in assessing model fit. Instead of testing the hypothesis that the fit is perfect, we ask (a) "How bad is the fit of our model to our statistical population?" and (b) "How accurately have we determined population badness-of-fit from our sample data?"
Noncentrality Indices Steiger-Lind RMSEA – Compensates for model parsimony by dividing the estimate of the population noncentrality parameter by the degrees of freedom. This ratio, in a sense, represents a "mean square badness-of-fit." Values of the RMSEA index below .05 indicate good fit, and values below .01 indicate outstanding fit.
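The RMSEA point estimate can be computed directly from a model's chi-square, degrees of freedom, and sample size; a minimal sketch using the common formula:

```python
import math

def rmsea(chi2, df, n):
    """Point estimate of the Steiger-Lind RMSEA: the estimated
    population noncentrality, max(chi2 - df, 0), divided by
    df * (n - 1) and square-rooted - a 'mean square badness-of-fit'."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
```

For chi-square = 52 on 40 df with n = 201, RMSEA is about .039, below the .05 good-fit cutoff; a chi-square at or below its df gives an RMSEA of zero.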
Noncentrality Indices McDonald's Index of Noncentrality – Represents one approach to transforming the population noncentrality index F* into the range from 0 to 1. Good fit is indicated by values above .95.
Noncentrality Indices The Population Gamma Index – an estimate of the "population GFI," the value of the GFI that would be obtained if we could analyze the population covariance matrix Σ. For this index, good fit is indicated by values above .95.
Noncentrality Indices Adjusted Population Gamma Index (Joreskog AGFI) – an estimate of the population GFI corrected for model parsimony. Good fit is indicated by values above .95.
Single Sample Goodness of fit Index Joreskog GFI.  Values above .95 indicate good fit. This index is a negatively biased estimate of the population GFI, so it tends to produce a slightly pessimistic view of the quality of population fit. Joreskog AGFI.   Values above .95 indicate good fit. This index is, like the GFI, a negatively biased estimate of its population equivalent.
Single Sample Goodness of fit Index Akaike Information Criterion.  This criterion is useful primarily for deciding which of several nested models provides the best approximation to the data. When trying to decide between several nested models, choose the one with the smallest Akaike criterion.  Schwarz's Bayesian Criterion.  This criterion, like the Akaike, is used for deciding among several models in a nested sequence. When deciding among several nested models, choose the one with the smallest Schwarz criterion value.
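For both the Akaike and Schwarz criteria the decision rule is simply "smallest value wins"; a sketch (the model names are hypothetical):

```python
def best_nested_model(criteria):
    """Among a nested sequence of models, choose the one with the
    smallest information-criterion value (Akaike or Schwarz/Bayesian)."""
    return min(criteria, key=criteria.get)
```

For example, given criterion values {"1-factor": 310.2, "2-factor": 295.8, "3-factor": 301.4}, the 2-factor model would be preferred.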
Single Sample Goodness of fit Index Browne-Cudeck Cross-Validation Index. Browne and Cudeck (1989) proposed a single-sample cross-validation index as a follow-up to their earlier two-sample index (Cudeck & Browne, 1983), which requires two samples: the calibration sample for fitting the models and the cross-validation sample. Independence Model Chi-square and df. These are the chi-square goodness-of-fit statistic, and associated degrees of freedom, for the hypothesis that the population covariances are all zero.
Single Sample Goodness of fit Index Bentler-Bonett (1980) Normed Fit Index. Measures the relative decrease in the discrepancy function caused by switching from a "Null Model" or baseline model to a more complex model. This index approaches 1 in value as fit becomes perfect; however, it does not compensate for model parsimony. Bentler-Bonett Non-Normed Fit Index. This comparative index takes into account model parsimony. Bentler Comparative Fit Index. This comparative index estimates the relative decrease in population noncentrality obtained by changing from the "Null Model" to the k'th model.
Single Sample Goodness of fit Index James-Mulaik-Brett Parsimonious Fit Index. Compensates for model parsimony by rescaling the Bentler-Bonett Normed Fit Index. Bollen's Rho. This comparative fit index computes the relative reduction in the discrepancy function per degree of freedom when moving from the "Null Model" to the k'th model. Bollen's Delta. This index is similar in form to the Bentler-Bonett index, but rewards simpler models (those with higher degrees of freedom).
Reference for the Goodness of fit for CFA StatSoft, Inc.  (2005).  STATISTICA electronic manual.   Tulsa OK: Author.
Issue? When does a study call for an exploratory or a confirmatory factor analysis? If you wish to restrict the number of factors extracted to a particular number and specify particular patterns of relationship between measured variables and common factors, and this is done a priori (before seeing the data), then the confirmatory procedure is for you. If you have no such well-specified a priori restrictions, then use the exploratory procedure.

More Related Content

PPTX
Chap 4 biomechanics of ligaments
PPTX
Distal biceps tendon rupture - by Hussain Algawahmed
PPTX
Journal club schatzker - Copy.pptx
PPTX
gait analysers
PDF
Clinical Applications Of Low Level Laser Therapy In Physical Therapy
PPTX
Therapeutic ultrasound
PPTX
Biomechanics of the hip and knee joint
Chap 4 biomechanics of ligaments
Distal biceps tendon rupture - by Hussain Algawahmed
Journal club schatzker - Copy.pptx
gait analysers
Clinical Applications Of Low Level Laser Therapy In Physical Therapy
Therapeutic ultrasound
Biomechanics of the hip and knee joint

What's hot (8)

PPTX
Interferential therapy
PPTX
Hand anatomy- harsh amin
PPT
gastrocnemius flap
PPTX
Top 10 estee lauder cover letter samples
PPTX
Biomechanics
PPT
Effect Of Training On The Anaerobic Energy System
PPT
Five Most Common Running Injuries
PPTX
VOLAR APPROACH TO WRIST.pptx
Interferential therapy
Hand anatomy- harsh amin
gastrocnemius flap
Top 10 estee lauder cover letter samples
Biomechanics
Effect Of Training On The Anaerobic Energy System
Five Most Common Running Injuries
VOLAR APPROACH TO WRIST.pptx
Ad

Similar to Factor anaysis scale dimensionality (20)

PPTX
PPTX
Factor analysis
PPTX
Factor Analysis from sets of measures.pptx
PPTX
Priya
PPTX
Factor analysis (1)
PPTX
Factor Analysis of MPH Biostatistics.pptx
PPTX
Marketing Research-Factor Analysis
PDF
Factor Analysis - Part 2 By Vikramjit Singh
PPTX
Factor analysis ppt
PPTX
An Introduction to Factor analysis ppt
PDF
factor-analysis (1).pdf
PPTX
9. Factor Analysis_JASP.pptx..................................
PPTX
Factor Analysis in Research
PPTX
08 - FACTOR ANALYSIS PPT.pptx
PDF
Factor Analysis - Statistics
PPTX
Factor analysis
PDF
Multinomial Logistic Regression.pdf
PDF
Factor Analysis
PDF
Unit-3 Data Analytics.pdf
PDF
Overview Of Factor Analysis Q Ti A
Factor analysis
Factor Analysis from sets of measures.pptx
Priya
Factor analysis (1)
Factor Analysis of MPH Biostatistics.pptx
Marketing Research-Factor Analysis
Factor Analysis - Part 2 By Vikramjit Singh
Factor analysis ppt
An Introduction to Factor analysis ppt
factor-analysis (1).pdf
9. Factor Analysis_JASP.pptx..................................
Factor Analysis in Research
08 - FACTOR ANALYSIS PPT.pptx
Factor Analysis - Statistics
Factor analysis
Multinomial Logistic Regression.pdf
Factor Analysis
Unit-3 Data Analytics.pdf
Overview Of Factor Analysis Q Ti A
Ad

More from Carlo Magno (20)

PPTX
Intervetntions-based assessment - supevisors - private schools.pptx
PPTX
Assessment Using the SOLO Framework.pptx
PPTX
Social and Emotional Learning
PPTX
Educational assessment in the 4 ir
PPTX
The process of research mentoring
PPTX
Quality management services sustainability training
PPTX
Managing technology integration in schools
PPTX
Integrating technology in teaching
PPTX
Empowering educators on technology integration
PPTX
Designing an online lesson
PPTX
Curriculum integration
PPTX
Accountability in Developing Student Learning
PPTX
The Instructional leader: TOwards School Improvement
PPTX
Guiding your child on their career decision making
PPTX
Assessing Science Inquiry Skills
PPTX
Assessment in the Social Studies Curriculum
PPTX
Quantitative analysis in language research
PPTX
Integrating technology in teaching
PPTX
Hallmarks of textbook
PDF
managing the learner centered-classroom
Intervetntions-based assessment - supevisors - private schools.pptx
Assessment Using the SOLO Framework.pptx
Social and Emotional Learning
Educational assessment in the 4 ir
The process of research mentoring
Quality management services sustainability training
Managing technology integration in schools
Integrating technology in teaching
Empowering educators on technology integration
Designing an online lesson
Curriculum integration
Accountability in Developing Student Learning
The Instructional leader: TOwards School Improvement
Guiding your child on their career decision making
Assessing Science Inquiry Skills
Assessment in the Social Studies Curriculum
Quantitative analysis in language research
Integrating technology in teaching
Hallmarks of textbook
managing the learner centered-classroom

Recently uploaded (20)

PDF
3rd Neelam Sanjeevareddy Memorial Lecture.pdf
PDF
01-Introduction-to-Information-Management.pdf
PDF
Saundersa Comprehensive Review for the NCLEX-RN Examination.pdf
PDF
RMMM.pdf make it easy to upload and study
PDF
Insiders guide to clinical Medicine.pdf
PPTX
Renaissance Architecture: A Journey from Faith to Humanism
PPTX
Cell Structure & Organelles in detailed.
PPTX
school management -TNTEU- B.Ed., Semester II Unit 1.pptx
PPTX
Week 4 Term 3 Study Techniques revisited.pptx
PPTX
Final Presentation General Medicine 03-08-2024.pptx
PDF
TR - Agricultural Crops Production NC III.pdf
PPTX
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
PPTX
Introduction_to_Human_Anatomy_and_Physiology_for_B.Pharm.pptx
PDF
Module 4: Burden of Disease Tutorial Slides S2 2025
PDF
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
PPTX
PPT- ENG7_QUARTER1_LESSON1_WEEK1. IMAGERY -DESCRIPTIONS pptx.pptx
PDF
Anesthesia in Laparoscopic Surgery in India
PDF
VCE English Exam - Section C Student Revision Booklet
PDF
FourierSeries-QuestionsWithAnswers(Part-A).pdf
PDF
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf
3rd Neelam Sanjeevareddy Memorial Lecture.pdf
01-Introduction-to-Information-Management.pdf
Saundersa Comprehensive Review for the NCLEX-RN Examination.pdf
RMMM.pdf make it easy to upload and study
Insiders guide to clinical Medicine.pdf
Renaissance Architecture: A Journey from Faith to Humanism
Cell Structure & Organelles in detailed.
school management -TNTEU- B.Ed., Semester II Unit 1.pptx
Week 4 Term 3 Study Techniques revisited.pptx
Final Presentation General Medicine 03-08-2024.pptx
TR - Agricultural Crops Production NC III.pdf
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
Introduction_to_Human_Anatomy_and_Physiology_for_B.Pharm.pptx
Module 4: Burden of Disease Tutorial Slides S2 2025
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
PPT- ENG7_QUARTER1_LESSON1_WEEK1. IMAGERY -DESCRIPTIONS pptx.pptx
Anesthesia in Laparoscopic Surgery in India
VCE English Exam - Section C Student Revision Booklet
FourierSeries-QuestionsWithAnswers(Part-A).pdf
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf

Factor anaysis scale dimensionality

  • 1. Scale Dimensionality Dr. Carlo Magno De La Salle University
  • 2.  
  • 4. Unidimensional Scaling Aimed at selecting a set of data items that can be empirically demonstrated to correspond to a single social-psychological dimension (Gordon, 1977). Methods: Thurstone or Equal-Appearing Interval Scaling Likert or "Summative" Scaling Guttman or "Cumulative" Scaling.
  • 5. Multidimensional Scaling There is more than a single dimension that underlies a set of observations. Aside from measuring the degree of the object, it classifies the object according to two or more properties. The trait being measures as a whole is called a latent construct /variable. The components of the latent are called factors/subscales/manifest variables.
  • 6. Multidimensional Scaling Methods of analyzing multidimensionality: Factor Analysis Exploratory Factor Analysis Principal Component Analysis Joining Tree Clustering K-Means clustering Confirmatory Factor Analysis Structural Equations Modeling (SEM)
  • 7. Questions? Give examples of unidimensional and multidimensional constructs. How do you know if a construct is unidimensional or multidimensional?
  • 8. Sandra Bem (1980) – BSRI Aggression Scale Gender Role Identity Masculinity Femininity Aggression Physical Verbal
  • 9. Ambivalent Sexism Inventory by Fiske and Glick (1996) 2 Factor Theory of Intelligence by Charles Spearman Sexism Benevolent sexism Hostile sexism Intelligence G Factor S factor
  • 10. Theory of Intelligence by James McKeen Cattell Myers Briggs Type Indicator Intelligence Crystallized Intelligence Fluid Intelligence Temperament Extraversion/Introversion Sensing/Intuition Thinking/Feeling Judging/Perceiving
  • 11. Factor Analysis (1) to reduce the number of variables and (2) to detect structure in the relationships between variables, that is to classify variables. A factor is a set of highly intercorrelated variables. Principal Component Analysis Factor Loading Eigenvalue Factor Rotation Principal Factor Analysis Communalities
  • 12. Principal Component Analysis Combining variables that are highly correlated with each other. Factor Loadings-the correlations between the factors and the variables. Factor loading of .30 – the variable contributes meaningfully to the factor. Factor loadings of .40 – high contributions to the factor If the the factors and the variables are strongly correlated it means that they are measuring the same thing.
  • 13. Eigenvalues A measure of how much variance each successive factor extracts. the first factor is generally more highly correlated with the variables than the second factor. This is to be expected because these factors are extracted successively and will account for less and less variance overall. Factor extraction stops when factors begin to yield low eigenvalues.
  • 14. Eigenvalues Methods of evaluating eigenvalues The Kaiser criterion. retain only factors with eigenvalues greater than 1. Unless a factor extracts at least as much as the equivalent of one original variable, it is dropped (Kaiser, 1960). The scree test. A graphical method where the eigenvalues are placed in a line plot. The place where the smooth decrease of eigenvalues appears to level off to the right of the plot. To the right of this point, presumably, one finds only "factorial scree" - "scree" is the geological term referring to the debris which collects on the lower part of a rocky slope.
  • 15. Factor Rotation Maximize the variance (variability) of the "new" variable (factor), while minimizing the variance around the new variable. Factor loadings are plotted in a scatterplot. In that plot, each variable is represented as a point. In this plot we could rotate (clockwise/ counterclockwise) the axes in any direction without changing the relative locations of the points to each other; however, the actual coordinates of the points, that is, the factor loadings would of course change. To better position the factor axes through, at least nearer to, clusters of highly correlated variables.
  • 16. Why are factors rotated? Factors are redefined in a second phase factor analysis in order to achieve a simpler factor structure that is interpretable. Factor rotation enables the researcher to meet conditions such as: Each variable loads strongly on one and only one factor Each factor shows two or more strong loadings Most loadings are high or low, with few intermediate value
  • 17. Methods of Factor Rotation Orthogonal Rotation Varimax Rotation Quartimax Equamax Oblique Rotation
  • 20. Orthogonal Rotation The perpendicularity of factor axes is maintained. The rotated factors explains the same total variance as do the unrotated factors. The proportion of variance of individual variables that is explained changes (change in communalities).
  • 21. Types of Orthogonal Rotation Varimax rotation – Maximizes the variance of the squared factor loadings associated with each factor. Maximizes the factor loadings for each column. Quartimax Rotation – Minimizing the cross-products of the factor loadings. Maximizes the factor loadings for each item. Equimax rotation – compromise between varimax and equimax. Enhance interpretability between factors and items (variables).
  • 23. Oblique Rotation Provides greater flexibility in positioning factor axes by relaxing the requirement that the factors be orthogonal. Axes are positioned directly through clusters of highly correlated variables. Create factors that more strongly represent clusters of highly correlated variables than do orthogonal rotation.
  • 24. What rotation should be used? Orthogonal Oblique Purpose: To create a reduced set of orthogonal factor variates that will be entered into other analyses. If the loadings produced are interpretable then stick to simpler solutions. If the FA is being used to uncover the factor structure underlying a set of variables. If there is sound theoretical reasons to expect high correlations
  • 25. Reading for Factor Rotation Deikohoff, G. (1992). Statistics for the social and behavioral sciences: Univariate, bivariate and multivariate. Dubuque, IA: Wm. C. Brown Publishers.
  • 26. Principal Factor Analysis Also known as Principle Axis Functioning The factors does not extract all variances in the items (presence of scree) Communalities - proportion of variance of a particular item that is due to common factors (shared with other items) is called communality.
  • 27. What is the difference between PCA and PFA? PCA each squared factor loading gives the proportion of variance in an original variable that is explained by a given factor variate. p x 1.0 Communalities are sum of squared loadings for given variable across factors. Proportion of variance in a variable which is explained by the set of extracted factor variates.
  • 28. Exploratory Factor Analysis Absence of prior hypothesis regarding the factor structure underlying a battery of attitudes (Tucker & McCallum, 1997). Used in the early stages of research on developing a concept. Hypothesis will be at best loosely defined, the general objective of the research is to explore the factorial structure of the domain.
  • 29. Exploratory Factor Analysis Analysis: Involves estimation of all parameters of the model (common factor weights, intercorrelations, and unique variances) Use Provides aid for the researchers to determine the number of factors, interpreting the nature of those factors, and refining the battery of attributes of the for the purpose of further study of the domain.
  • 30. Exploratory Factor Analysis Techniques Tree Analysis K-means cluster analysis Principal component analysis Multidimensional scaling
  • 31. Procedure Identify the items that will be analyzed through EFA Calculate the correlation matrix Examine the matrix Inverse of the matrix Bartlett Test Anti-image covariance matrix Kaiser-Meyer-Olkin-Criteria (KMO) Show the factor loadings
  • 32. Procedure Decide on the number of factors to be extracted using Scree –test and elbow method Interpret the factors extracted
  • 33. Statistical Output The Kaiser-Meyer-Olkin (KMO)- measure of sampling adequacy tests whether the partial correlations among variables are small. Bartlett's test of sphericity -tests whether the correlation matrix is an identity matrix, which would indicate that the factor model is inappropriate. Anti-image- Contains the negatives of the partial correlation coefficients, and the anti-image covariance matrix contains the negatives of the partial covariances. In a good factor model, most of the off-diagonal elements will be small. The measure of sampling adequacy for a variable is displayed on the diagonal of the anti-image correlation matrix.
  • 34. Confirmatory Factor Analysis There is a developed and specific hypothesis about the factorial structure of a battery of attributes. The hypothesis concerns the number of common factors, their pattern of intercorrelation, and pattern of common factor weights. Used to indicate how well a set of data fits the hypothesized structure. The structure is hypothesized in advance. Follow-up to a standard factor analysis
  • 35. Confirmatory Factor Analysis Allows the investigator to fit common factor model to observed data under various types of constraints. Analysis: The parameters of the model is estimated, and the goodness of fit of the solution to the data is evaluated. The degree to which the solution fit the data would provide evidence for or against the prior hypothesis.
  • 36. Confirmatory Factor Analysis A solution which fit well would lend support for the hypothesis and provide evidence for construct validity of the attributes and the hypothesized factorial structure of the domain as represented by the battery of attributes. CFA is integrated in Structural Equation Modeling (SEM), helping create the latent variables modeled by SEM. It is done to validate a scale or index by demonstrating that its constituent items load on the same factor, and to drop proposed scale items which cross-load on more than one factor.
  • 37. Approaches in CFA Traditional Approach - Uses principle axis factoring (PAF/PFA). This method allows the researcher to examine factor loadings of indicator variables to determine if they load on latent variables (factors) as predicted by the researcher's model. SEM Approach – Model building where the covariance of every pair of factors are determine with their corresponding errors. It tests whether factors are significant components of the latent construct.
  • 38. Goodness of Fit Noncentrality Interval Estimation Single Sample Goodness of fit Index
  • 39. Noncentrality Interval Estimation Represents a change of emphasis in assessing model fit. Instead of testing the hypothesis that the fit is perfect, we ask the questions (a) "How bad is the fit of our model to our statistical population?" and (b) "How accurately have we determined population badness-of-fit from our sample data."
  • 40. Noncentrality Indices Steiger-Lind RMSEA - compensates for model parsimony by dividing the estimate of the population noncentrality parameter by the degrees of freedom. This ratio, in a sense, represents a "mean square badness-of-fit." Values of the RMSEA index below .05 indicate good fit, and values below .01 indicate outstanding fit.
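The RMSEA described above can be computed directly from the chi-square statistic, its degrees of freedom, and the sample size. The helper below is an illustrative sketch of that standard formula (the function name is hypothetical): the population noncentrality parameter is estimated as max(χ² − df, 0)/(N − 1), then divided by df and square-rooted.

```python
import math

def rmsea(chi2, df, n):
    """Steiger-Lind RMSEA: square root of the estimated population
    noncentrality divided by the degrees of freedom; floored at 0
    so that better-than-expected fit does not go negative."""
    f_star = max(chi2 - df, 0.0) / (n - 1)  # estimated population noncentrality
    return math.sqrt(f_star / df)
```

For example, χ² = 85 with df = 40 and N = 200 yields an RMSEA of about .075, which by the rule of thumb on this slide falls just short of "good fit."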
  • 41. Noncentrality Indices McDonald's Index of Noncentrality -The index represents one approach to transforming the population noncentrality index F* into the range from 0 to 1. Good fit is indicated by values above .95.
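The transformation mentioned above is commonly given as Mc = exp(−F*/2), where F* is the estimated population noncentrality; this maps perfect fit to 1 and worsening fit toward 0. A minimal sketch, with an illustrative function name:

```python
import math

def mcdonald_nci(chi2, df, n):
    """McDonald's Index of Noncentrality: exp(-F*/2), where
    F* = max(chi2 - df, 0) / (N - 1). Equals 1 for perfect fit."""
    f_star = max(chi2 - df, 0.0) / (n - 1)
    return math.exp(-0.5 * f_star)
```

With χ² = 85, df = 40, N = 200, the index comes out near .89, below the .95 good-fit threshold stated on the slide.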
  • 42. Noncentrality Indices The Population Gamma Index - an estimate of the "population GFI," the value of the GFI that would be obtained if we could analyze the population covariance matrix Σ. For this index, good fit is indicated by values above .95.
  • 43. Noncentrality Indices Adjusted Population Gamma Index (Joreskog AGFI) - estimate of the population GFI corrected for model parsimony. Good fit is indicated by values above .95.
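One common formulation of these two indices (following Steiger's presentation; the function names and the exact formulas here are a hedged sketch, not quoted from the slides) expresses the population Gamma index as p/(p + 2F*), with p the number of observed variables, and applies a Joreskog-style parsimony correction for the adjusted version:

```python
def population_gamma(chi2, df, n, p):
    """Estimate of the population GFI: Gamma = p / (p + 2 * F*),
    where p is the number of observed variables and F* is the
    estimated population noncentrality."""
    f_star = max(chi2 - df, 0.0) / (n - 1)
    return p / (p + 2.0 * f_star)

def adjusted_population_gamma(chi2, df, n, p):
    """Parsimony-corrected (AGFI-style) version of the Gamma index."""
    gamma = population_gamma(chi2, df, n, p)
    return 1.0 - (p * (p + 1.0)) / (2.0 * df) * (1.0 - gamma)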
  • 44. Single Sample Goodness of fit Index Joreskog GFI. Values above .95 indicate good fit. This index is a negatively biased estimate of the population GFI, so it tends to produce a slightly pessimistic view of the quality of population fit. Joreskog AGFI.   Values above .95 indicate good fit. This index is, like the GFI, a negatively biased estimate of its population equivalent.
  • 45. Single Sample Goodness of fit Index Akaike Information Criterion. This criterion is useful primarily for deciding which of several nested models provides the best approximation to the data. When trying to decide between several nested models, choose the one with the smallest Akaike criterion. Schwarz's Bayesian Criterion. This criterion, like the Akaike, is used for deciding among several models in a nested sequence. When deciding among several nested models, choose the one with the smallest Schwarz criterion value.
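Conventions for the Akaike and Schwarz criteria vary across SEM programs; one common form adds a penalty proportional to the number of free parameters to the chi-square. The sketch below (illustrative names, assuming that convention) shows the selection rule stated on the slide: compute the criterion for each nested candidate model and choose the smallest.

```python
import math

def aic(chi2, n_free_params):
    """One common SEM convention: AIC = chi-square + 2 * (free parameters)."""
    return chi2 + 2 * n_free_params

def bic(chi2, n_free_params, n):
    """Schwarz's Bayesian criterion: chi-square + ln(N) * (free parameters);
    penalizes extra parameters more heavily than AIC for N > 7 or so."""
    return chi2 + math.log(n) * n_free_params

def best_model(criterion_by_model):
    """Among nested candidates, pick the model with the smallest criterion."""
    return min(criterion_by_model, key=criterion_by_model.get)
```

Here a two-factor model (χ² = 85, 13 parameters) would be preferred over a one-factor model (χ² = 120, 10 parameters), since its chi-square drop outweighs the parameter penalty.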
  • 46. Single Sample Goodness of fit Index Browne-Cudeck Cross Validation Index. Browne and Cudeck (1989) proposed a single sample cross-validation index as a follow-up to their earlier work (Cudeck & Browne, 1983). It requires two samples, i.e., the calibration sample for fitting the models, and the cross-validation sample. Independence Model Chi-square and df. These are the Chi-square goodness-of-fit statistic, and associated degrees of freedom, for the hypothesis that the population covariances are all zero.
  • 47. Single Sample Goodness of fit Index Bentler-Bonett (1980) Normed Fit Index. Measures the relative decrease in the discrepancy function caused by switching from a "Null Model," or baseline model, to a more complex model. This index approaches 1 in value as fit becomes perfect. However, it does not compensate for model parsimony. Bentler-Bonett Non-Normed Fit Index. This comparative index takes into account model parsimony. Bentler Comparative Fit Index. This comparative index estimates the relative decrease in population noncentrality obtained by changing from the "Null Model" to the k'th model.
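The three comparative indices above are all functions of the fitted model's chi-square and the independence (null) model's chi-square. The sketch below uses the standard textbook formulas (function names are illustrative): NFI is the proportional drop in chi-square, the Non-Normed (Tucker-Lewis) index works with chi-square/df ratios to reward parsimony, and the CFI compares noncentrality estimates.

```python
def nfi(chi2_m, chi2_null):
    """Bentler-Bonett Normed Fit Index: relative decrease in
    chi-square from the null model to the fitted model."""
    return (chi2_null - chi2_m) / chi2_null

def nnfi(chi2_m, df_m, chi2_null, df_null):
    """Non-Normed Fit Index (Tucker-Lewis): like NFI, but based on
    chi-square/df ratios, so it penalizes model complexity."""
    r_null, r_m = chi2_null / df_null, chi2_m / df_m
    return (r_null - r_m) / (r_null - 1.0)

def cfi(chi2_m, df_m, chi2_null, df_null):
    """Bentler Comparative Fit Index: relative decrease in the
    estimated noncentrality, floored so it stays in [0, 1]."""
    d_m = max(chi2_m - df_m, 0.0)
    d_null = max(chi2_null - df_null, d_m)
    return 1.0 - d_m / d_null if d_null > 0 else 1.0
```

With a fitted model at χ² = 85 (df = 40) against a null model at χ² = 900 (df = 45), NFI ≈ .91 while CFI ≈ .95, showing how the noncentrality-based CFI can be more forgiving than the raw NFI.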
  • 48. Single Sample Goodness of fit Index James-Mulaik-Brett Parsimonious Fit Index. Compensates for model parsimony. Basically, it operates by rescaling the Bentler-Bonett Normed Fit Index to compensate for model parsimony. Bollen's Rho. This comparative fit index computes the relative reduction in the discrepancy function per degree of freedom when moving from the "Null Model" to the k'th model. Bollen's Delta. This index is similar in form to the Bentler-Bonett index, but rewards simpler models (those with higher degrees of freedom).
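These three indices are commonly given by the formulas sketched below (a hedged illustration; names and exact forms follow the usual textbook presentations, not the slides): PNFI rescales the NFI by the ratio of the model's df to the null model's df, Bollen's rho compares chi-square-per-df ratios, and Bollen's delta divides the chi-square drop by the null chi-square minus the model's df, so higher model df inflates the index.

```python
def pnfi(chi2_m, df_m, chi2_null, df_null):
    """James-Mulaik-Brett parsimonious fit: NFI rescaled by df ratio."""
    normed_fit = (chi2_null - chi2_m) / chi2_null
    return (df_m / df_null) * normed_fit

def bollen_rho(chi2_m, df_m, chi2_null, df_null):
    """Bollen's rho: relative reduction in chi-square per degree of
    freedom when moving from the null model to the fitted model."""
    r_null, r_m = chi2_null / df_null, chi2_m / df_m
    return (r_null - r_m) / r_null

def bollen_delta(chi2_m, df_m, chi2_null):
    """Bollen's delta: NFI-like, but the denominator subtracts the
    model's df, rewarding models with more degrees of freedom."""
    return (chi2_null - chi2_m) / (chi2_null - df_m)
```

Using the same example (model χ² = 85, df = 40; null χ² = 900, df = 45), PNFI drops the NFI of .91 to about .80, the price paid for the parameters the model spends.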
  • 49. Reference for the Goodness of Fit Indices for CFA StatSoft, Inc. (2005). STATISTICA electronic manual. Tulsa, OK: Author.
  • 50. Issue? When does a study call for an exploratory or confirmatory factor analysis? If you wish to restrict the number of factors extracted to a particular number and specify particular patterns of relationship between measured variables and common factors, and this is done a priori (before seeing the data), then the confirmatory procedure is for you. If you have no such well-specified a priori restrictions, then use the exploratory procedure.