IBM401 Lecture 5

  • 1. Quantitative Analysis for Business, Lecture 5, August 9th, 2010
  • 2. Multiple Regression Analysis. Multiple regression models are extensions of the simple linear model and allow models with several independent variables: Y = β0 + β1X1 + β2X2 + … + βkXk + ε, where Y = dependent variable (response variable); Xi = ith independent variable (predictor or explanatory variable); β0 = intercept (value of Y when all Xi = 0); βi = coefficient of the ith independent variable; k = number of independent variables; ε = random error term.
  • 3. Multiple Regression Analysis. To estimate these values, a sample is taken and the following equation is developed: Ŷ = b0 + b1X1 + b2X2 + … + bkXk, where Ŷ = predicted value of Y; b0 = sample intercept (an estimate of β0); bi = sample coefficient of the ith variable (an estimate of βi). Interpreting Multiple Regression Results: the intercept is the value of the dependent variable when the independent variables are all equal to zero; each slope coefficient is the estimated change in the dependent variable for a one-unit change in that independent variable, holding the other independent variables constant. The slope coefficients are sometimes called "partial slope coefficients".
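To make the estimation step concrete, here is a minimal sketch (not part of the original slides) of fitting such an equation by least squares with NumPy; the data values are invented purely for illustration.

```python
import numpy as np

# Hypothetical sample: Y with two predictors X1 and X2 (illustrative values only)
X1 = np.array([2.1, 1.4, 3.0, 2.6, 1.9, 2.4])
X2 = np.array([10.0, 15.0, 8.0, 12.0, 20.0, 9.0])
Y  = np.array([30.0, 22.0, 38.0, 33.0, 21.0, 34.0])

# Design matrix with a column of ones for the intercept b0
X = np.column_stack([np.ones_like(X1), X1, X2])

# Least-squares estimates b = (b0, b1, b2)
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("b0, b1, b2 =", b)

# Fitted values Y-hat = b0 + b1*X1 + b2*X2
Y_hat = X @ b
```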
  • 4. Interpreting example: 10-year real earnings growth of the S&P 500 (EG10). Intercept term: if the dividend payout ratio (PR) is zero and the slope of the yield curve (YC) is zero, we would expect the subsequent 10-year real earnings growth rate to be -11.6% (the intercept). Slope coefficient of PR: if the payout ratio increases by 1%, we would expect the subsequent 10-year earnings growth rate to increase by 0.25%, holding YC constant. Slope coefficient of YC: if the yield curve slope increases by 1%, we would expect the subsequent 10-year earnings growth rate to increase by 0.14%, holding PR constant.
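Reading those three numbers as a fitted equation gives EG10 = -11.6 + 0.25·PR + 0.14·YC (in percent). A tiny sketch of using it for a point prediction, with hypothetical PR and YC inputs:

```python
def predict_eg10(pr: float, yc: float) -> float:
    """Predicted 10-year real earnings growth (%), using the slide's coefficients."""
    return -11.6 + 0.25 * pr + 0.14 * yc

# Hypothetical inputs: payout ratio of 60% and yield-curve slope of 2%
print(predict_eg10(60.0, 2.0))  # -11.6 + 15.0 + 0.28 = 3.68
```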
  • 5. Hypothesis testing of regression coefficients. The t-statistic is used to test the significance of an individual coefficient in a multiple regression and has n - k - 1 degrees of freedom: t = (estimated regression coefficient - hypothesized value) / (standard error of bj).
  • 6. Ex: testing the statistical significance of a regression coefficient. Test the statistical significance of the independent variable PR in the real earnings growth example at the 10% significance level. The data are based on 46 observations.
  • 7. Ex: testing the statistical significance of a regression coefficient. We are testing the hypothesis H0: βPR = 0 versus Ha: βPR ≠ 0. The 10% two-tailed critical t-value with 43 degrees of freedom (46 - 2 - 1) is approximately 1.68, so we reject the null hypothesis if the t-statistic is greater than 1.68 or less than -1.68. Since the computed t-statistic is greater than 1.68, we reject the null hypothesis and conclude that the PR regression coefficient is statistically significant at the 10% significance level.
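A minimal sketch of the same test in code. The degrees of freedom and significance level come from the slide; the coefficient estimate and standard error below are hypothetical placeholders, since the slide does not reproduce them.

```python
from scipy import stats

n, k = 46, 2                    # observations and independent variables (from the slide)
df = n - k - 1                  # 43 degrees of freedom
alpha = 0.10

b_pr, se_pr = 0.25, 0.10        # hypothetical coefficient estimate and standard error
t_stat = (b_pr - 0.0) / se_pr   # test H0: beta_PR = 0

t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-tailed critical value, about 1.68
reject = abs(t_stat) > t_crit
print(f"t = {t_stat:.2f}, critical = {t_crit:.2f}, reject H0: {reject}")
```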
  • 8. Example: Jenny Wilson Realty. Jenny Wilson wants to develop a model to determine the suggested listing price for houses based on the size and age of the house: Ŷ = b0 + b1X1 + b2X2, where Ŷ = predicted value of the dependent variable (selling price); b0 = Y intercept; X1 and X2 = the two independent variables (square footage and age, respectively); b1 and b2 = slopes for X1 and X2, respectively.
  • 9. Jenny Wilson Realty. She selects a sample of houses that have sold recently and records the data shown in Table 4.5.
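The slides show this fit as spreadsheet output (Table 4.5 and the program output are not reproduced in the transcript). As a rough sketch of the same kind of fit, assuming the statsmodels library and using invented house data, one could write the following; its summary reports the quantities the later slides interpret (coefficient p-values, the F-test p-value, and r²).

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical listings: square footage, age (years), and selling price ($000)
houses = pd.DataFrame({
    "sqft":  [1926, 2069, 1720, 1396, 1706, 1847, 1950, 2323],
    "age":   [30, 40, 30, 15, 32, 38, 27, 30],
    "price": [95.0, 119.0, 124.8, 135.0, 142.0, 145.0, 159.0, 165.0],
})

X = sm.add_constant(houses[["sqft", "age"]])   # adds the intercept column
model = sm.OLS(houses["price"], X).fit()
print(model.summary())                          # coefficients, p-values, F-stat, R-squared
```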
  • 11. Evaluating Multiple Regression Models. Evaluation is similar to that of simple linear regression models.
  • 12. The p-value for the F-test and r² are interpreted the same way.
  • 13. The hypothesis is different because there is more than one independent variable
  • 14. The F-test investigates the null hypothesis that all the slope coefficients are equal to 0.
  • 15. p-value – the smallest level of significance for which the null hypothesis can be rejected
  • 19. If the p-value is less than the significance level, the null hypothesis is rejected; if the p-value is greater than the significance level, the null hypothesis cannot be rejected. Evaluating Multiple Regression Models: to determine which independent variables are significant, tests are performed for each variable.
  • 20. The test statistic is calculated, and if the p-value is lower than the level of significance (α), the null hypothesis is rejected. Ex: interpreting p-values. Given the following regression results, determine which regression parameters for the independent variables are statistically significantly different from zero at the 1% significance level, assuming the sample size is 60. An independent variable is statistically significant if its p-value is less than 1%, or 0.01; X1 and X3 are statistically significantly different from zero.
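A minimal sketch of that screening step; the p-values below are hypothetical stand-ins for the regression output the slide refers to.

```python
alpha = 0.01                                            # 1% significance level from the slide
p_values = {"X1": 0.002, "X2": 0.087, "X3": 0.0004}     # hypothetical regression output

significant = [name for name, p in p_values.items() if p < alpha]
print("Statistically significant at 1%:", significant)  # ['X1', 'X3']
```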
  • 21. F-statistic. The F-test assesses how well the set of independent variables, as a group, explains the variation of the dependent variable. The F-statistic is used to test whether at least one of the independent variables explains a significant portion of the variation of the dependent variable.
  • 22. F-statistic. The F-statistic is calculated as F = MSR / MSE, where MSR = SSR / k and MSE = SSE / (n - k - 1); SSR = regression sum of squares, SSE = sum of squared errors, MSR = mean regression sum of squares, MSE = mean squared error. Reject H0 if the F-statistic > Fc (the critical value).
  • 23. Ex: calculating and interpreting the F-statistic. An analyst runs a regression of monthly value-stock returns on five independent variables over 60 months. The total sum of squares is 460, and the sum of squared errors is 170. Test, at the 5% significance level, the null hypothesis that the coefficients on all five independent variables are equal to zero. The critical F-value for 5 and 54 degrees of freedom at the 5% significance level is approximately 2.40.
  • 24. Ex: calculating and interpreting the F-statistic. The null and alternative hypotheses are H0: β1 = β2 = β3 = β4 = β5 = 0 versus Ha: at least one βj ≠ 0. Calculations: SSR = 460 - 170 = 290, MSR = 290 / 5 = 58, MSE = 170 / 54 ≈ 3.15, so F = 58 / 3.15 ≈ 18.4. Since the F-statistic exceeds the critical value of 2.40, we reject the null hypothesis: at least one independent variable is significantly different from zero.
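A short sketch that reproduces this arithmetic and the critical value (using scipy for the F distribution is an assumption about tooling, not something the slides specify):

```python
from scipy import stats

sst, sse = 460.0, 170.0
n, k = 60, 5

ssr = sst - sse                  # 290
msr = ssr / k                    # 58.0
mse = sse / (n - k - 1)          # about 3.15
f_stat = msr / mse               # about 18.4

f_crit = stats.f.ppf(0.95, dfn=k, dfd=n - k - 1)   # about 2.39
print(f"F = {f_stat:.2f}, critical = {f_crit:.2f}, reject H0: {f_stat > f_crit}")
```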
  • 25. Ex: Jenny Wilson Realty. The model is statistically significant.
  • 26. The p-value for the F-test is 0.002
  • 27. r² = 0.6719, so the model explains about 67% of the variation in selling price (Y)
  • 28. But the F-test is for the entire model, and we cannot tell whether one or both of the independent variables are significant
  • 29. By calculating the p-value of each variable, we can assess the significance of the individual variables
  • 30. Since the p-values for X1 (square footage) and X2 (age) are both less than the significance level of 0.05, both null hypotheses can be rejected. Coefficient of determination (R²): the multiple coefficient of determination, R², can be used to test the overall effectiveness of the entire set of independent variables in explaining the dependent variable.
  • 31. Adjusted R². Unfortunately, R² by itself may not be a reliable measure of a multiple regression model: R² almost always increases as variables are added to the model, so we need to take the number of variables into account. Adjusted R² is calculated as Ra² = 1 - (1 - R²)(n - 1) / (n - k - 1), where n = number of observations, k = number of independent variables, and Ra² = adjusted R².
  • 32. Adjusted R². Whenever there is more than one independent variable, Ra² is less than or equal to R². Adding new variables to the model will increase R² but may increase or decrease Ra²; Ra² may even be less than 0 if R² is low enough.
  • 33. Ex: adjusted R². An analyst runs a regression of monthly value-stock returns on five independent variables over 60 months. The total sum of squares for the regression is 460, and the sum of squared errors is 170. Calculate R² and the adjusted R² (Ra²).
  • 34. Ex: adjusted R². The R² of 63% suggests that the five independent variables together explain 63% of the variation in monthly value-stock returns. Suppose the analyst now adds four more independent variables to the regression and the R² increases to 65%. Which model would the analyst most likely prefer?
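The comparison behind the answer can be checked in a few lines, using the adjusted R² formula from slide 31 and the numbers from this example:

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R-squared: penalizes R-squared for the number of predictors k."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

n = 60
r2_five = 1 - 170 / 460                 # about 0.63 with five predictors
r2_nine = 0.65                          # after adding four more predictors

print(adjusted_r2(r2_five, n, k=5))     # about 0.60
print(adjusted_r2(r2_nine, n, k=9))     # about 0.59; lower, so the five-variable model wins
```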
  • 35. The analyst would prefer the first model because its adjusted R² (Ra²) is higher and it uses five independent variables rather than nine. Binary or Dummy Variables: binary (or dummy, or indicator) variables are special variables created for qualitative data.
  • 36. A dummy variable is assigned a value of 1 if a particular condition is met and a value of 0 otherwise
  • 37. The number of dummy variables must equal one less than the number of categories of the qualitative variable. Jenny Wilson Realty: Jenny's qualitative variable is the condition of the house, with three categories: excellent, mint, and good (Table 4.5).
  • 40. Jenny Wilson Realty. Jenny believes a better model can be developed if she includes information about the condition of the property: X3 = 1 if the house is in excellent condition, 0 otherwise; X4 = 1 if the house is in mint condition, 0 otherwise. Two dummy variables are used to describe the three categories of condition.
  • 41. No variable is needed for the "good" condition: if both X3 and X4 equal 0, the house must be in good condition.
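A minimal sketch of building the two dummies for the three condition categories, assuming pandas; the listings below are invented:

```python
import pandas as pd

condition = pd.Series(["excellent", "mint", "good", "mint", "excellent"])

# Two dummies for three categories: "good" is the omitted baseline
X3 = (condition == "excellent").astype(int)   # 1 if excellent, 0 otherwise
X4 = (condition == "mint").astype(int)        # 1 if mint, 0 otherwise

dummies = pd.DataFrame({"X3_excellent": X3, "X4_mint": X4})
print(dummies)
```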
  • 42. Jenny Wilson Realty (Program 4.3). The new model explains about 90% of the variation in selling price; the F-value indicates the model is significant, and the low p-values indicate that each variable is significant.
  • 43. Model Building. The best model is a statistically significant model with a high r² and few variables.
  • 44. As more variables are added to the model, the r2-value usually increases
  • 45. For this reason, the adjusted r2 value is often used to determine the usefulness of an additional variable
  • 46. The adjusted r² takes into account the number of independent variables in the model. The formula for r² is r² = SSR / SST (the proportion of total variation explained by the regression).
  • 47. The formula for adjusted r² is Ra² = 1 - (1 - r²)(n - 1) / (n - k - 1). Model Building: as the number of variables increases, the adjusted r² gets smaller unless the increase due to the new variable is large enough to offset the change in k. In general, if a new variable increases the adjusted r², it should probably be included in the model.
  • 48. In some cases, variables contain duplicate information
  • 49. When two independent variables are correlated, they are said to be collinear
  • 50. When more than two independent variables are correlated, multicollinearity exists
  • 51. When multicollinearity is present, hypothesis tests for the individual coefficients are not valid, but the model may still be useful. Nonlinear Regression: in some situations the relationship between variables is not linear (contrast a linear with a nonlinear relationship); transformations may be used to turn a nonlinear model into a linear model.
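As a quick collinearity check, not shown in the slides, one can inspect the pairwise correlations among the predictors; a minimal sketch with invented data:

```python
import numpy as np

# Hypothetical predictors; X2 is nearly a multiple of X1, so the two are collinear
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = 2.0 * X1 + np.array([0.1, -0.1, 0.05, 0.0, -0.05, 0.1])
X3 = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0])

corr = np.corrcoef([X1, X2, X3])   # pairwise correlation matrix
print(np.round(corr, 2))           # the X1-X2 entry will be close to 1
```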
  • 52. Ex: Colonel Motors. The engineers want to use regression analysis to improve fuel efficiency.
  • 53. They have been asked to study the impact of weight on miles per gallon (MPG); see Table 4.6.
  • 54. Colonel Motors, linear model (Figure 4.6A): scatter plot of MPG versus weight (1,000 lb.).
  • 55. Colonel Motors. The linear fit is a useful model, with a small p-value for the F-test of significance and a good r² value. Figure 4.6B shows the nonlinear model: MPG versus weight (1,000 lb.).
  • 56. The nonlinear model is a quadratic model
  • 57. The easiest way to work with this model is to develop a new variable
  • 58. Colonel Motors. This gives us a model that can be solved with linear regression software.
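A minimal sketch of that transformation: create the squared-weight variable and fit the quadratic model by ordinary least squares. The weight and MPG values are invented.

```python
import numpy as np

# Hypothetical data: vehicle weight (1,000 lb.) and fuel efficiency (MPG)
weight = np.array([1.9, 2.4, 2.8, 3.2, 3.8, 4.3, 4.8])
mpg    = np.array([38.0, 32.0, 27.0, 24.0, 21.0, 19.5, 19.0])

# New variable: weight squared, which turns the quadratic model into a linear one
X = np.column_stack([np.ones_like(weight), weight, weight ** 2])

b, *_ = np.linalg.lstsq(X, mpg, rcond=None)   # b0, b1, b2 for MPG = b0 + b1*W + b2*W^2
print(b)
```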
  • 59. Colonel Motors. The quadratic model is a better model, with a smaller p-value for the F-test of significance and a larger adjusted r² value. Cautions and Pitfalls: if the assumptions are not met, the statistical tests may not be valid; correlation does not necessarily mean causation; multicollinearity makes interpreting coefficients problematic, but the model may still be good; using a regression model beyond the range of X is questionable, as the relationship may not hold outside the sample data.
  • 60. Cautions and Pitfalls: t-tests for the intercept (b0) may be ignored, as this point is often outside the range of the model; a linear relationship may not be the best relationship, even if the F-test returns an acceptable value; a nonlinear relationship can exist even if a linear relationship does not; and just because a relationship is statistically significant doesn't mean it has any practical value.