CORRELATION
Avjinder Singh Kaler and Kristi Mai
• Linear correlation coefficient, r, is a number that measures how well paired sample data fit a straight-line pattern when graphed.
• Using paired sample data (sometimes called bivariate data), we find the value of r (usually using technology), then we use that value to conclude that there is (or is not) a linear correlation between the two variables.
• In this section we will consider only linear relationships, which means that when graphed, the points approximate a straight-line pattern.
• We will discuss methods of hypothesis testing for correlation.
Correlation – a correlation exists between two variables when the
values of one variable are somehow associated with the values of the
other variable.
• Can be positive, negative, non-existent, or non-linear
• A linear correlation exists between two variables when there is a
correlation and the plotted points of paired data result in a pattern that
can be approximated by a straight line.
We can often see a relationship between two variables by
constructing a scatterplot.
Scatter plots of paired data
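As a minimal sketch of constructing such a scatterplot (Python with matplotlib assumed; the paired values below are hypothetical placeholders, not data from this lecture):

```python
import matplotlib.pyplot as plt

# Hypothetical paired (x, y) sample data, used only to illustrate the plot.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.3]

plt.scatter(x, y)                      # one point per (x, y) pair
plt.xlabel("x (explanatory variable)")
plt.ylabel("y (response variable)")
plt.title("Scatterplot of paired data")
plt.show()
```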
1. The sample of paired data is a Simple Random Sample of
quantitative data
2. The pairs of data (𝑥, 𝑦) have a bivariate normal distribution; this formal requirement is difficult to check, so in practice we verify the following:
• Visual examination of the scatter plot(s) confirms that the sample points follow an approximately straight-line pattern
• Because results can be strongly affected by the presence of outliers, any
outliers should be removed if they are known to be errors (Note: Use caution
when removing data points)
Note: These are the same as the Requirements for Simple Linear
Regression.
• Linear Correlation Coefficient (𝑟) – measures the strength of the linear
correlation between the paired quantitative 𝑥 and 𝑦 values in a sample
• Also known as the Pearson Product Moment Correlation Coefficient in honor of
Karl Pearson
• This is a Sample Statistic measuring the linear correlation between 𝑥 and 𝑦
• If this value is squared, the result is the Coefficient of Determination (𝑟²)
• Notation:
• 𝑟 : linear correlation coefficient for sample data
• 𝜌 : linear correlation coefficient for a population of paired data
• Formula for calculating 𝑟:
$$r = \frac{n\,\Sigma xy - (\Sigma x)(\Sigma y)}{\sqrt{n(\Sigma x^{2}) - (\Sigma x)^{2}}\;\sqrt{n(\Sigma y^{2}) - (\Sigma y)^{2}}}$$
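A minimal sketch that evaluates this formula directly (Python with NumPy assumed; the paired values are hypothetical placeholders) and checks the result against NumPy's built-in correlation:

```python
import numpy as np

# Hypothetical paired sample data used only to illustrate the formula.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.9])
n = len(x)

# r = [n*Sum(xy) - (Sum x)(Sum y)] / sqrt{[n*Sum(x^2) - (Sum x)^2][n*Sum(y^2) - (Sum y)^2]}
num = n * np.sum(x * y) - np.sum(x) * np.sum(y)
den = np.sqrt(n * np.sum(x**2) - np.sum(x)**2) * np.sqrt(n * np.sum(y**2) - np.sum(y)**2)
r = num / den

print(r)                        # r from the shortcut formula
print(np.corrcoef(x, y)[0, 1])  # same value from NumPy's correlation matrix
```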
1. −1 ≤ r ≤ 1
2. If all values of either variable are converted to a different scale, the value of r does not change.
3. The value of r is not affected by the choice of which variable is labeled x and which is labeled y; interchanging them does not change r.
4. r measures the strength of a linear relationship.
5. r is very sensitive to outliers
• A single outlier can dramatically affect the value of r
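A brief sketch illustrating property 5, the sensitivity of r to a single outlier (Python with NumPy assumed; all values are hypothetical):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.0, 2.9, 4.2, 5.1])      # nearly perfect straight-line pattern
print(np.corrcoef(x, y)[0, 1])               # r is close to 1

# Append a single outlying pair and recompute r.
x2 = np.append(x, 6.0)
y2 = np.append(y, 0.5)                       # the outlier breaks the pattern
print(np.corrcoef(x2, y2)[0, 1])             # r drops dramatically
```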
• The value of 𝑟² is the proportion of the variation in 𝑦 that is explained by the linear relationship between 𝑥 and 𝑦
• Thus, 𝑟² is also the proportion of the variation in 𝑦 that is explained by the regression line itself
• We may use 𝑟² to describe the predictive power of the regression equation
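In symbols, for simple linear regression this is the standard identity

$$r^{2} = \frac{\text{explained variation}}{\text{total variation}} = \frac{\sum (\hat{y} - \bar{y})^{2}}{\sum (y - \bar{y})^{2}}$$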
Common errors involving correlation:
• To conclude that correlation implies causality
• Using data based on averages
• This type of data causes an inflated correlation coefficient
• To conclude that if there is no linear correlation, there is no correlation at all
Hypotheses:
𝐻0: 𝜌 = 0 (no linear correlation)
𝐻1: 𝜌 ≠ 0 (linear correlation exists)
These hypotheses can be equivalently tested with the following
hypotheses:
𝐻0: 𝛽1 = 0 (no linear correlation)
𝐻1: 𝛽1 ≠ 0 (linear correlation exists)
Note: This equivalence will be important for the interpretation of the
technological output.
• Use Critical Value from Table A-6 (this is a simpler approach) and think
of the Linear Correlation Coefficient (𝑟) as a ‘test statistic’
• OR, use the following t-score test statistic with 𝑑𝑓 = 𝑛 − 2
$$t = \frac{r}{\sqrt{\dfrac{1 - r^{2}}{n - 2}}}$$
• This 𝑡 test statistic can be viewed in most technological output
corresponding to the test of significance for the slope in the regression
line (i.e. the second set of equivalent hypotheses listed above)
• If using critical values from Table A-6:
If |𝑟| > critical value, reject 𝐻0
If |𝑟| ≤ critical value, fail to reject 𝐻0
This test is obviously a two-tailed hypothesis test based on the alternative
hypothesis. Visualize this by plotting the possible values for 𝑟 on a number line
with a labeled critical region.
• If using the t-score test statistic:
• Use statistical software to calculate the correct p-value that corresponds
with the test statistic. Then, base the conclusion on comparison between
the p-value and 𝛼
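A minimal sketch of the t-score approach (Python with SciPy assumed; the values of r and n are generic inputs, not tied to a particular data set):

```python
from scipy import stats

def correlation_t_test(r, n):
    """Two-tailed test of H0: rho = 0 from the sample correlation r and sample size n."""
    t = r / ((1 - r**2) / (n - 2)) ** 0.5       # t statistic with df = n - 2
    p_value = 2 * stats.t.sf(abs(t), df=n - 2)  # two-tailed P-value
    return t, p_value

t, p = correlation_t_test(r=0.5, n=20)
print(t, p)   # reject H0 when p <= alpha
```

When the raw paired data are available, scipy.stats.pearsonr(x, y) returns both r and this two-tailed P-value directly.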
One-tailed tests can occur with a claim of a positive linear correlation or a claim of a negative linear correlation. In such cases, the alternative hypothesis becomes 𝐻1: 𝜌 > 0 or 𝐻1: 𝜌 < 0, respectively.
For these one-tailed tests, the P-value method can be used as well.
• Construct a scatter plot and verify that the pattern of the points is
approximately a straight line pattern without outliers
• Assess the linear correlation between two variables of interest and
create a regression equation
• Consider any effects of a pattern over time
• Perform a Residual Analysis:
• Construct a residual plot and verify that there is no pattern (other than a
straight line pattern) and also verify that the residual plot does not become
thicker or thinner
• Use a histogram, normal quantile plot, or Shapiro-Wilk test of normality to confirm that the values of the residuals have a distribution that is approximately normal
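A sketch of such a residual analysis (Python with matplotlib and SciPy assumed; the arrays below are hypothetical stand-ins for the paired sample data):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Hypothetical paired data standing in for the real sample.
x = np.array([20.0, 22.0, 24.0, 26.0, 28.0, 30.0, 32.0, 34.0])
y = np.array([160.0, 165.0, 163.0, 170.0, 173.0, 176.0, 175.0, 181.0])

fit = stats.linregress(x, y)                     # least-squares regression line
residuals = y - (fit.intercept + fit.slope * x)

# Residual plot: look for no pattern and a roughly constant spread.
plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("x")
plt.ylabel("residual")
plt.show()

# Shapiro-Wilk test: a large P-value is consistent with approximately normal residuals.
print(stats.shapiro(residuals))
```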
• Measurement Error – could be described as ‘explainable’ outliers
• Nonlinear Associations – ignoring possible nonlinear relationships
• Extrapolation – predicting far beyond the scope of our available
data
The paired shoe / height data from five males are listed below.
Using StatCrunch, find the value of the correlation coefficient r.
Requirement Check:
The data are a simple random sample of quantitative data, the plotted
points appear to roughly approximate a straight-line pattern, and there
are no outliers.
Output from a few technologies used to calculate the value of r is displayed below.
We found previously for the shoe and height example that r = 0.591.
With r = 0.591, we get r² = 0.349.
We conclude that about 34.9% of the variation in height can be
explained by the linear relationship between lengths of shoe prints and
heights.
Conduct a formal hypothesis test of the claim that there is a linear
correlation between the two variables.
Use a 0.05 significance level.
We test the claim:
𝐻0: 𝛽1 = 0 (no linear correlation)
𝐻1: 𝛽1 ≠ 0 (linear correlation exists)
We calculate the test statistic:
$$t = \frac{r}{\sqrt{\dfrac{1 - r^{2}}{n - 2}}} = \frac{0.591}{\sqrt{\dfrac{1 - 0.591^{2}}{5 - 2}}} = 1.269$$
Table A-3 shows this test statistic yields a p-value that is greater than 0.20.
StatCrunch provides a
P-value of 0.2937.
Because the p-value of 0.2937 is greater than the significance level of 0.05,
we fail to reject the null hypothesis.
We conclude there is not sufficient evidence to support the claim that there
is a linear correlation between shoe print length and heights of males.
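As a quick numerical check of the t statistic and P-value above (Python with SciPy assumed):

```python
from scipy import stats

r, n = 0.591, 5
t = r / ((1 - r**2) / (n - 2)) ** 0.5   # 1.269, matching the hand calculation
p = 2 * stats.t.sf(t, df=n - 2)         # two-tailed P-value, roughly 0.29
print(t, p)
```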
Using the critical value method, the test statistic is r = 0.591.
The critical values of r = ±0.878 are found in Table A-6 with n = 5 and α = 0.05.
Because |r| = 0.591 does not exceed 0.878, we fail to reject the null hypothesis and conclude there is not sufficient evidence to support the claim that there is a linear correlation between shoe print length and heights of males.
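A one-line check of this critical-value comparison (0.878 is the Table A-6 critical value quoted above):

```python
r, critical_value = 0.591, 0.878
print(abs(r) > critical_value)   # False, so we fail to reject H0
```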
Use the 5 pairs of shoe print lengths and heights to predict the height of
a person with a shoe print length of 29 cm.
The regression line does not fit the points well. The correlation is r = 0.591,
which suggests there is not a linear correlation (the p-value was 0.2937).
From StatCrunch, the best predicted height is simply the mean of the sample heights: $\bar{y} = 177.3$ cm.
Use the 40 pairs of shoe print lengths from Data Set 2 in Appendix B to predict the height of a person with a shoe print length of 29 cm.
Now, the regression line does fit the points well, and the correlation of r = 0.813 suggests that there is a linear correlation, since the p-value is < 0.0001.
The regression equation and scatterplot are shown below:
The given shoe length of 29 cm is not beyond the scope of the available data, so substitute 29 cm into the regression model:
A person with a shoe length of 29 cm is predicted to be 174.3 cm tall.
Using StatCrunch, the regression equation is
$$\hat{y} = 80.9 + 3.22x$$
and substituting the shoe print length of 29 cm gives
$$\hat{y} = 80.9 + 3.22(29) \approx 174.3 \text{ cm}$$
What if we have two or more explanatory variables?
Do we have a method for this? YES!! Of course, we do.
We may want to predict a sea turtle’s lifespan by more variables than
simply length of shell! I want to use variables that account for the diet,
exercise, mental health, and captivity status of the turtle! What about
variables that also account for the water quality in the turtle’s
surrounding environment? I want that too!!!
To do this, we can use something known as Multiple Linear Regression!