MAHARSHI DAYANAND SARASWATI
UNIVERSITY, AJMER
TYPES OF REGRESSION ANALYSIS
SUBMITTED BY,
EKTA BAFNA
M.TECH 2nd SEM.
SUBMITTED TO,
SIR PRAMOD RATHORE
TABLE OF CONTENTS
Introduction to Regression Analysis
Terminologies
Linear Regression
Polynomial Regression
Support Vector Regression
Decision Tree Regression
Ridge Regression
Lasso Regression
Logistic Regression
Regression analysis is a predictive modelling technique that investigates the relationship between a dependent (target) variable and one or more independent (predictor) variables.
This technique is used for forecasting, time-series modelling, and finding the causal-effect relationship between variables.
Regression is a supervised learning technique that helps find the correlation between variables and enables us to predict a continuous output variable based on one or more predictor variables.
Terminologies related to regression analysis:
Dependent variable
Independent variable
Outliers
Underfitting and overfitting
LINEAR REGRESSION:
• Linear regression is a statistical regression method used for predictive analysis.
• If there is only one input variable (x), it is called simple linear regression; if there is more than one input variable, it is called multiple linear regression.
Mathematical equation for linear regression:
Y = aX + b
Here, Y = dependent variable (target variable),
X = independent variable (predictor variable),
a and b are the linear coefficients (a is the slope, b is the intercept).
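As an illustration (not part of the original slides), here is a minimal sketch of simple linear regression using scikit-learn; the toy data and variable names are hypothetical:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: y is roughly 3*x + 5 plus noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))         # one predictor -> simple linear regression
y = 3 * X[:, 0] + 5 + rng.normal(0, 1, 50)   # continuous target

model = LinearRegression()
model.fit(X, y)

print("a (slope):", model.coef_[0])        # estimate of a in Y = aX + b
print("b (intercept):", model.intercept_)  # estimate of b
print("prediction at X = 4:", model.predict([[4]])[0])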
Polynomial Regression:
• Polynomial regression is a type of regression that models a non-linear dataset using a linear model.
• It is similar to multiple linear regression, but it fits a non-linear curve between the values of x and the corresponding conditional values of y.
• Suppose a dataset consists of data points arranged in a non-linear fashion; in such a case, linear regression will not fit those points well. To cover such data points, we need polynomial regression (see the sketch below).
Polynomial regression equation:
Y = b0 + b1x + b2x^2 + b3x^3 + ... + bnx^n
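To make the idea concrete, here is a minimal sketch (not from the original slides) that fits a degree-3 polynomial with scikit-learn; the cubic toy data is hypothetical:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical non-linear data: y follows a cubic curve plus noise
rng = np.random.default_rng(1)
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = 0.5 * X[:, 0] ** 3 - X[:, 0] + rng.normal(0, 1, 60)

# PolynomialFeatures expands x into [1, x, x^2, x^3]; LinearRegression then fits
# Y = b0 + b1*x + b2*x^2 + b3*x^3, a model that is still linear in its coefficients.
poly_model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
poly_model.fit(X, y)
print(poly_model.predict([[2.0]]))   # predicted y at x = 2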
Support Vector Regression:
Support Vector Machine (SVM) is a supervised learning algorithm that can be used for regression as well as classification problems. When it is used for regression problems, it is termed Support Vector Regression (SVR).
Below are some keywords used in Support Vector Regression (a sketch follows the list):
Kernel: a function used to map lower-dimensional data into a higher-dimensional space.
Hyperplane: in a general SVM it is the separating line between two classes, but in SVR it is the line that helps predict the continuous variable and covers most of the data points.
Boundary lines: the two lines on either side of the hyperplane that create a margin for the data points.
Support vectors: the data points nearest to the hyperplane; they determine the margin.
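The keywords above map directly onto scikit-learn's SVR parameters; the following sketch (not from the original slides, with hypothetical data and parameter values) fits an RBF-kernel SVR:

import numpy as np
from sklearn.svm import SVR

# Hypothetical noisy sine data to illustrate a non-linear fit
rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0, 5, size=(40, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 40)

# kernel='rbf' maps the data into a higher-dimensional space;
# epsilon sets the width of the margin (tube) around the hyperplane.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1)
svr.fit(X, y)
print(svr.predict([[2.5]]))   # predicted value near sin(2.5)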
Decision Tree Regression:
• Decision tree regression builds a tree-like structure in which each internal node represents a "test" on an attribute, each branch represents the outcome of the test, and each leaf node represents the final decision or result.
• Decision Tree is a supervised learning algorithm that can be used for solving both classification and regression problems.
• It can handle both categorical and numerical data.
Random Forest is one of the most powerful supervised learning algorithms and is capable of performing regression as well as classification tasks.
With the help of random forest regression, we can reduce overfitting by building each tree on a random subset of the dataset and combining the trees' predictions (see the sketch below):
g(x) = f0(x) + f1(x) + f2(x) + ...
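A minimal sketch (not from the original slides, with hypothetical data) comparing a single decision tree with a random forest, which combines many trees as in g(x) above:

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

# Hypothetical data: the target is a step-like function of one feature
rng = np.random.default_rng(3)
X = rng.uniform(0, 10, size=(200, 1))
y = np.where(X[:, 0] > 5, 20.0, 10.0) + rng.normal(0, 1, 200)

tree = DecisionTreeRegressor(max_depth=3)         # a single tree
forest = RandomForestRegressor(n_estimators=100)  # an ensemble of trees, g(x)
tree.fit(X, y)
forest.fit(X, y)

print("single tree:", tree.predict([[7.0]])[0])
print("random forest:", forest.predict([[7.0]])[0])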
Ridge Regression:
• Ridge regression is one of the most robust versions of linear regression, in which a small amount of bias is introduced so that we can get better long-term predictions.
• Ridge regression is a regularization technique used to reduce the complexity of the model. It is also called L2 regularization.
The equation for ridge regression, in its standard form, adds an L2 penalty to the least-squares cost:
Cost = Σ(y_i − ŷ_i)^2 + λ Σ b_j^2
where λ controls the strength of the penalty.
A general linear or polynomial regression will fail if there is high collinearity between the independent variables; ridge regression can be used to solve such problems (see the sketch below).
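A minimal sketch (not from the original slides) of ridge regression on deliberately collinear, hypothetical data; the alpha parameter plays the role of λ above:

import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical data with two highly collinear predictors
rng = np.random.default_rng(4)
x1 = rng.normal(0, 1, 100)
x2 = x1 + rng.normal(0, 0.01, 100)     # nearly identical to x1 (high collinearity)
X = np.column_stack([x1, x2])
y = 2 * x1 + rng.normal(0, 0.5, 100)

ridge = Ridge(alpha=1.0)               # alpha is the L2 penalty strength
ridge.fit(X, y)
print(ridge.coef_)                     # coefficients are shrunk toward 0 and shared across the collinear features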
Lasso Regression:
• Lasso regression is another regularization technique used to reduce the complexity of the model.
• It is similar to ridge regression, except that the penalty term contains the absolute values of the weights instead of their squares.
• Since it uses absolute values, it can shrink a coefficient all the way to 0, whereas ridge regression can only shrink it close to 0.
• It is also called L1 regularization. The equation for lasso regression, in its standard form, is:
Cost = Σ(y_i − ŷ_i)^2 + λ Σ |b_j|
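A minimal sketch (not from the original slides) of lasso regression on hypothetical data where only two of five features matter, showing coefficients shrunk exactly to 0:

import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical data: only the first two of five features influence y
rng = np.random.default_rng(5)
X = rng.normal(0, 1, size=(100, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.5, 100)

lasso = Lasso(alpha=0.1)               # alpha is the L1 penalty strength
lasso.fit(X, y)
print(lasso.coef_)                     # coefficients of the irrelevant features are driven to exactly 0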
Logistic Regression:
• Logistic regression is another supervised learning algorithm, used to solve classification problems. In classification problems, the dependent variable is in a binary or discrete format, such as 0 or 1, Yes or No, True or False.
• Logistic regression uses the sigmoid (logistic) function to map the model's output to a probability between 0 and 1.
The function can be represented as:
f(x) = 1 / (1 + e^(-x))
Here, f(x) = output between 0 and 1,
x = input to the function,
e = base of the natural logarithm.
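A minimal sketch (not from the original slides) of the sigmoid function and a logistic-regression classifier on hypothetical pass/fail data:

import numpy as np
from sklearn.linear_model import LogisticRegression

# The sigmoid squashes any real number into the interval (0, 1)
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))   # 0.5

# Hypothetical binary data: pass (1) / fail (0) versus hours studied
rng = np.random.default_rng(6)
hours = rng.uniform(0, 10, size=(100, 1))
passed = (hours[:, 0] + rng.normal(0, 1, 100) > 5).astype(int)

clf = LogisticRegression()
clf.fit(hours, passed)
print(clf.predict_proba([[7.0]]))   # [P(fail), P(pass)] after 7 hours of study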