UNIT 2
SUPERVISED LEARNING
[BME, CSE, ECE, EEE & Mechanical]
Linear Regression Models: Least squares, single & multiple
variables, Bayesian linear regression, gradient descent, Linear
Classification Models: Discriminant function – Perceptron
algorithm, Probabilistic discriminative model - Logistic
regression, Probabilistic generative model – Naïve Bayes,
Maximum margin classifier – Support vector machine,
Decision Tree, Random Forests.
Supervised Learning
• Supervised machine learning is a fundamental approach
to machine learning and artificial intelligence.
• It involves training a model using labeled data, where each
input comes with a corresponding correct output.
How does it work?
[Figure: workflow of supervised learning, from labelled training data to predictions on new data.]
Steps involved in Supervised Learning
• First, determine the type of the training dataset.
• Collect/gather the labelled training data.
• Split the dataset into training, validation, and test sets.
• Determine the input features of the training dataset, which
should carry enough information for the model to accurately
predict the output.
• Determine the suitable algorithm for the model, such as
support vector machine, decision tree, etc.
Steps involved in Supervised Learning
• Execute the algorithm on the training dataset.
Sometimes a validation set is also needed to tune the
control parameters; it is a subset of the training data.
• Evaluate the accuracy of the model by providing the
test set. If the model predicts the correct outputs,
the model is accurate.
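This end-to-end workflow can be sketched in a few lines with scikit-learn; a minimal sketch, in which the iris dataset, the 80/20 split, and the decision-tree choice are illustrative assumptions, not part of the slides:

```python
# Minimal supervised-learning workflow: split, train, evaluate.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)            # labelled data: inputs X, outputs y

# Split the labelled data into training and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = DecisionTreeClassifier()             # choose a suitable algorithm
model.fit(X_train, y_train)                  # execute it on the training set

# Evaluate accuracy on the held-out test set.
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```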
Types
• Supervised learning is divided into two types of problems:
Classification and Regression.
Classification
• Classification algorithms are used when the output
variable is categorical, i.e., it takes one of a set of discrete
classes, for example two classes such as Yes-No, Male-Female, or True-False.
• Below are some popular Classification algorithms
which come under supervised learning:
– Random Forest
– Decision Trees
– Logistic Regression
– Support vector Machines
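As a quick illustration, a hedged sketch of fitting one of these classifiers (logistic regression); scikit-learn and the synthetic two-class dataset are assumptions for illustration:

```python
# Fit a binary classifier on synthetic labelled data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# 200 labelled samples, 4 input features, binary output (0 = "No", 1 = "Yes")
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

clf = LogisticRegression().fit(X, y)
print(clf.predict(X[:5]))    # predicted class labels for five inputs
```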
Regression
• Regression algorithms are used when the output variable is continuous and
there is a relationship between the input variable and the output variable.
• They are used for the prediction of continuous quantities, such as weather
forecasting, market trends, etc.
• Below are some popular Regression algorithms which come under supervised
learning:
– Linear Regression
– Regression Trees
– Non-Linear Regression
– Bayesian Linear Regression
– Polynomial Regression
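A minimal sketch of the first algorithm in this list, assuming synthetic data (the true slope 3 and intercept 2 below are made up for illustration):

```python
# Fit ordinary linear regression to noisy synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))             # one continuous input
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 50)   # continuous target with noise

reg = LinearRegression().fit(X, y)
print(reg.coef_, reg.intercept_)                 # recovered slope/intercept, approx. 3 and 2
print(reg.predict([[4.0]]))                      # predict a continuous value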
Linear Regression
• Linear regression is a statistical regression method which is used for
predictive analysis.
• It is one of the simplest regression algorithms; it models the
relationship between continuous variables.
• It is used for solving the regression problem in machine learning.
• Linear regression shows the linear relationship between the
independent variable (X-axis) and the dependent variable (Y-axis),
hence called linear regression.
Linear Regression-Contd
• If there is only one input variable (x), then such
linear regression is called simple linear regression.
• If there is more than one input variable, then such
linear regression is called multiple linear regression.
• The relationship between variables in the linear
regression model can be explained using the below
image.
Linear Regression-Contd
• Here we are predicting the salary of an employee on the
basis of years of experience.
[Figure: scatter plot of salary versus years of experience with the fitted regression line.]
Least Square Method
• The Least Squares method is a statistical technique used to find the
equation of the best-fitting curve or line to a set of data points by
minimizing the sum of the squared differences between the
observed values and the values predicted by the model.
• This method aims at minimizing the sum of squares of
deviations as much as possible. The line obtained from such a
method is called a regression line or line of best fit.
Linear Regression-Contd
• Below is the mathematical equation for linear regression:
Y = aX + b
Here,
Y = dependent variable (target variable),
X = independent variable (predictor variable),
a and b are the linear coefficients (slope and intercept).
Formula for Least Square Method
• Least Square Method formula is used to find the best-fitting
line through a set of data points.
• For simple linear regression, the model is a line of the form
y = mx + c,
where
y is the dependent variable,
x is the independent variable,
m is the slope of the line, and c is the y-intercept.
Formula for Least Square Method
• Formulas to calculate the slope (m) and intercept (c) of the line are
derived from the following equations:
• Slope (m): m = [n∑xy − (∑x)(∑y)] / [n∑x² − (∑x)²]
• Intercept (c): c = [∑y − m(∑x)] / n
Where:
• n is the number of data points,
• ∑xy is the sum of the products of paired x and y values,
• ∑x is the sum of all x values,
• ∑y is the sum of all y values,
• ∑x² is the sum of the squares of the x values.
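These two formulas translate directly into code; a minimal plain-Python sketch (the function name is an illustrative assumption):

```python
# Least-squares slope and intercept from the sum formulas above.
def least_squares_line(x, y):
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))    # ∑xy
    sum_x2 = sum(xi ** 2 for xi in x)                # ∑x²
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    c = (sum_y - m * sum_x) / n
    return m, c
```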
Steps for Least Square Method
• Step 1: Denote the independent variable values as xi and the dependent
ones as yi.
• Step 2: Calculate the average values of xi and yi as X and Y.
• Step 3: Presume the equation of the line of best fit as y = mx + c, where m is
the slope of the line and c represents the intercept of the line on the Y-axis.
• Step 4: The slope m can be calculated from the following formula:
m = [ (X – xi)×(Y – yi)] / (X – xi)2
Σ Σ
• Step 5: The intercept c is calculated from the following formula:
c = Y – mX
• Thus, we obtain the line of best fit as y = mx + c, where values of m and c
can be calculated from the formulae defined above.
16
Examples
• Problem 1: Find the line of best fit for the following data points using
the Least Square method: (x,y) = (1,3), (2,4), (4,8), (6,10), (8,15).
• Problem 2: Find the line of best fit for the following data of heights and
weights of students of a school using the Least Square method:
Height (in centimeters): [160, 162, 164, 166, 168]
Weight (in kilograms): [52, 55, 57, 60, 61]
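A worked sketch of Problem 1 with NumPy, applying the sum formulas directly (np.polyfit is used only as a cross-check):

```python
import numpy as np

# Data points from Problem 1
x = np.array([1, 2, 4, 6, 8])
y = np.array([3, 4, 8, 10, 15])

# Sum-based least-squares formulas for slope m and intercept c
n = len(x)
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
c = (np.sum(y) - m * np.sum(x)) / n
print(f"y = {m:.3f}x + {c:.3f}")    # y = 1.677x + 0.957

# Cross-check with NumPy's built-in degree-1 polynomial fit
print(np.polyfit(x, y, 1))           # approx. [1.677, 0.957]
```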
Multiple Linear Regression
• Multiple Regression is an extension of linear
regression, where we use multiple independent
variables to predict a dependent variable. It helps in
analyzing how several factors affect an outcome.
• The multiple regression equation looks like this:
Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ε
Multiple Linear Regression
• Where:
• Y = Dependent Variable
• X₁, X₂, X₃ = Independent Variables
• β₀ = Intercept (constant)
• β₁, β₂, β₃ = Regression coefficients (impact of each
variable)
• ε = Error term (random noise)
Multiple Linear Regression
• How to Solve Multiple Regression?
• To solve a multiple regression problem, follow
these 5 key steps:
1. Collect Data
2. Form the Regression Equation
3. Estimate Coefficients (β values)
4. Check Model Accuracy
5. Make Predictions
Example - Predicting Student Scores
Step 1: Collect Data
• We need historical data where we know the
dependent variable (Y) and independent variables
(X).
• Example: Predicting Student Scores 🎓
• We want to predict Final Exam Score (Y) based on:
• Study Hours (X₁)
• Sleep Hours (X₂)
• Past Grades (X₃)
Step 2: Form the Regression Equation
• The multiple regression equation is:
Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ε
Where:
• Y = Final Score (Dependent Variable)
• X₁, X₂, X₃ = Study Hours, Sleep Hours, Past Grades (Independent
Variables)
• β₀ = Intercept (constant)
• β₁, β₂, β₃ = Regression coefficients (impact of each variable)
• ε = Error term
Step 3: Estimate the Coefficients (β)
• Step 1: Organize Data in Matrix Form
– We write the equation in matrix form for computation:
Y = Xβ + ε
• Where:
• Y = Column vector of output values
• X = Matrix of independent variables (including a column of 1s for
β₀)
• β = Column vector of coefficients
• ε = Error term
Step 3: Estimate the Coefficients (β) - Contd
• For example, with 3 independent variables, the equation
Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ε
• can be written as Y = Xβ + ε, where row i of X is
[1, X₁ᵢ, X₂ᵢ, X₃ᵢ], β is the column vector [β₀, β₁, β₂, β₃]ᵀ,
and ε is the column vector of error terms.
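A sketch of this setup in NumPy for the student-scores example; the numeric values below are assumptions for illustration, not data from the slides:

```python
import numpy as np

# Hypothetical training data (assumed for illustration)
study_hours = np.array([2, 4, 6, 8, 10])       # X1
sleep_hours = np.array([8, 7, 7, 6, 5])        # X2
past_grades = np.array([60, 65, 70, 80, 85])   # X3
final_score = np.array([55, 62, 70, 79, 88])   # Y

# Design matrix X: a leading column of 1s (for β0), then one column per variable
X = np.column_stack([np.ones(len(study_hours)),
                     study_hours, sleep_hours, past_grades])
Y = final_score.astype(float)
print(X.shape, Y.shape)    # (5, 4) (5,)
```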
Step 3: Estimate the Coefficients (β) - Contd
• Step 2: Compute β Using the Least Squares Formula
– The best β values are found using the formula:
β = (XᵀX)⁻¹ XᵀY
• Step 3: Solve for β Values
Step 3: Estimate the Coefficients (β) - Contd
• Compute XᵀX and XᵀY from the data.
• Compute (XᵀX)⁻¹ (inverse of XᵀX).
• Multiply: β = (XᵀX)⁻¹ XᵀY.
[The step-by-step matrix arithmetic for this example was shown as slide images.]
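A NumPy sketch of these three steps, reusing the same assumed student data as in the design-matrix sketch above:

```python
import numpy as np

# Same hypothetical data as in the design-matrix sketch above
X = np.column_stack([
    np.ones(5),
    [2, 4, 6, 8, 10],       # study hours
    [8, 7, 7, 6, 5],        # sleep hours
    [60, 65, 70, 80, 85],   # past grades
])
Y = np.array([55, 62, 70, 79, 88.0])

XtX = X.T @ X
XtY = X.T @ Y
beta = np.linalg.inv(XtX) @ XtY     # textbook formula: β = (XᵀX)⁻¹ XᵀY
print(beta)

# In practice, a least-squares solver is numerically safer than an explicit inverse:
beta_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_ls)                      # agrees with beta
```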
Step 4: Check Model Accuracy (Goodness of Fit)
• Before making predictions, check if the model is accurate
using:
✅ R² (Coefficient of Determination) → Measures how well the
model explains the data (closer to 1 = better model).
✅ p-values → Check if independent variables significantly
impact Y (p < 0.05 is statistically significant).
✅ Multicollinearity Check → Ensure independent variables
aren't highly correlated (use VIF test).
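A hedged sketch of these three checks using statsmodels (an assumed dependency, not named in the slides), on the same hypothetical student data:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = np.column_stack([[2, 4, 6, 8, 10],        # study hours
                     [8, 7, 7, 6, 5],         # sleep hours
                     [60, 65, 70, 80, 85]])   # past grades
y = np.array([55, 62, 70, 79, 88.0])

Xc = sm.add_constant(X)             # prepend the intercept column
model = sm.OLS(y, Xc).fit()

print(model.rsquared)               # R²: closer to 1 = better fit
print(model.pvalues)                # p < 0.05 => variable is significant
# VIF for each independent variable (large VIF => multicollinearity)
print([variance_inflation_factor(Xc, i) for i in range(1, Xc.shape[1])])
```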
Step 5: Make Predictions
• Now that we have the equation, we can predict scores for
new data.
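For instance, plugging a new student's inputs into the fitted equation (the β values below are hypothetical placeholders, not fitted results):

```python
# Hypothetical fitted coefficients: Y = β0 + β1·X1 + β2·X2 + β3·X3
beta0, beta1, beta2, beta3 = 5.0, 3.2, 1.1, 0.5

study, sleep, grade = 7, 6, 75      # new student's inputs
score = beta0 + beta1 * study + beta2 * sleep + beta3 * grade
print(score)                        # 71.5
```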