Machine
Learning
Engr. Muhammad Suleman Memon
Assistant Professor & HoD
Department of Information Technology
Email: engrsuleman14@gmail.com
Cell#: 923332941065
What is Machine Learning?
Machine learning is a subfield of artificial intelligence.
It is designed to make computers learn by themselves and perform operations without
human intervention.
In other words, a computer or system built with machine learning will identify, analyse
and adapt when it comes across a new pattern of data, and give the expected output
without any need for humans.
How Machine
Learning Works?
Examples of Machine
Learning
Machine Learning – Methods
▪ Supervised Learning
▪ Unsupervised Learning
▪ Semi-Supervised Learning
▪ Reinforcement Learning
Machine Learning – Algorithms
▪ Neural networks
▪ Decision trees
▪ Random forests
▪ Support vector machines
▪ Nearest-neighbor mapping
▪ k-means clustering
▪ Self-organizing maps
▪ Expectation maximization
▪ Bayesian networks
▪ Kernel density estimation
▪ Principal component analysis
▪ Singular value decomposition
Machine Learning
Tools and Libraries
Which Industries Use
Machine Learning?
▪ Pharmaceuticals
▪ Banks and Financial Services
▪ Health Care and Treatments
▪ Online Sales
▪ Mining, Oil and Gas
▪ Government Schemes
Linear Regression
in Machine
Learning
Linear Regression is an algorithm that belongs to
supervised Machine Learning.
It learns a relationship that predicts the outcome
of an event from the independent-variable data
points.
The relationship is usually a straight line that fits
the different data points as closely as possible.
The output is of a continuous form, i.e., a numerical
value. For example, the output could be revenue
or sales in currency, the number of products sold,
etc.
Linear Regression
Equation
Linear regression can be
expressed mathematically as:
y = β0 + β1x + ε
y = dependent variable
x = independent variable
β0 = intercept of the line
β1 = linear regression
coefficient (slope of the line)
ε = random error
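As a minimal sketch of how the coefficients can be estimated, the standard least-squares closed-form formulas can be applied in Python with NumPy; the data values below are invented purely for illustration:

```python
import numpy as np

# Synthetic example data (values made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # independent variable
y = np.array([2.1, 4.3, 6.2, 8.1, 9.8])   # dependent variable

# Closed-form least-squares estimates:
# beta1 = cov(x, y) / var(x),  beta0 = mean(y) - beta1 * mean(x)
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

print(f"y ≈ {beta0:.2f} + {beta1:.2f} * x")
```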
Types of Linear Regression
▪ Simple Linear Regression
▪ Multiple Linear Regression
▪ Non-Linear Regression
Simple Linear Regression
A simple straight-line
equation involving a slope
and an intercept is utilized
in Simple Linear Regression:
y = mx + c, where y denotes
the output, x is the
independent variable, m is
the slope, and c is the
intercept, i.e. the value of
y when x = 0.
Multiple Linear
Regression
▪ When there is more than
one independent variable,
the governing linear equation
applicable to regression takes
the form:
▪ y = c + m1x1 + m2x2 + … + mnxn,
where each mi represents the
coefficient responsible for the
impact of the corresponding
independent variable xi.
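A minimal sketch of multiple linear regression, assuming scikit-learn is available and using a small made-up dataset with two independent variables:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Two independent variables (columns) and one dependent variable.
# The numbers are invented purely for illustration.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([5.0, 6.0, 11.0, 12.0, 16.0])

model = LinearRegression().fit(X, y)
print("intercept c:", model.intercept_)
print("coefficients m1, m2:", model.coef_)
print("prediction for [6.0, 4.0]:", model.predict([[6.0, 4.0]]))
```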
Non-Linear Regression
▪ When the best-fitting relationship is not a straight line but a curve, it is referred to
as Non-Linear Regression.
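One common way to fit a curve rather than a straight line is to expand the input with polynomial features and then apply ordinary linear regression. A small sketch, assuming scikit-learn and invented quadratic-looking data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Quadratic-looking data (values generated purely for illustration)
X = np.linspace(-3, 3, 20).reshape(-1, 1)
y = 0.5 * X.ravel() ** 2 + 1.0 + np.random.normal(scale=0.2, size=20)

# Degree-2 polynomial expansion followed by linear regression fits a curve
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print("prediction at x = 2.5:", model.predict([[2.5]]))
```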
Advantages of Linear
Regression
For linear datasets, Linear Regression performs well at finding the nature of the relationship
among different variables.
Linear Regression algorithms are easy to train and Linear Regression models are easy
to implement.
Linear Regression models are liable to over-fit, but this can be mitigated using techniques
such as regularization (L1 and L2) and cross-validation.
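As a sketch of the mitigation mentioned above, L2 (Ridge) and L1 (Lasso) regularization can be combined with cross-validation; the example assumes scikit-learn and uses synthetic data:

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                               # 50 observations, 10 attributes
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=50)  # synthetic target

for model in (Ridge(alpha=1.0), Lasso(alpha=0.1)):
    # 5-fold cross-validation guards against an over-optimistic fit
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(type(model).__name__, "mean R^2:", scores.mean().round(3))
```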
Disadvantages of Linear Regression
An important disadvantage of Linear
Regression is that it assumes a
linear (straight-line) relationship
between the dependent and
independent variables, which is
rarely present in real-world data.
It is prone to noise and overfitting.
In datasets where the number of
observations is smaller than the
number of attributes, Linear
Regression may not be a good
choice, as the algorithm can start
fitting the noise while building
the model.
Key Benefits of Linear Regression
Easy to Implement
Scalability
Interpretability
Applicability in real-time
Use Cases of Linear
Regression
➢ Agriculture
➢ Banking
➢ Finance
➢ Education
➢ Marketing
Agriculture
▪ Can be used to predict the amount of rainfall and crop yield.
Banking
▪ Used to predict the probability of loan defaults.
Finance sector
▪ Used to predict stock prices and assess associated risks.
Healthcare Sector
▪ Helpful in modelling healthcare costs and predicting patients' length of stay in
hospitals.
Sports analytics
▪ Can be used to predict the performance of players in upcoming games.
Education
▪ Can be used in education to predict student performances
in different courses.
Business
▪ To forecast product demands, predict product sales, decide on marketing
and advertising strategies, and so on.
Best Practices for Linear
Regression
1. Follow the Assumptions
2. Start with a Simple Model First
3. Use Visualizations
4. Start with Sample Dataset
5. Shifting to Multiple Linear Regression
6. Applying Linear Regression Model to Real-life Problems
7. Choosing Appropriate Data
Frequently Asked Questions (FAQs)
1. What is the output of Linear Regression in machine learning?
2. What are the benefits of using Linear Regression?
3. How do you explain a Linear Regression model?
4. Which type of dataset is used for Linear Regression?
5. Which ML model is best for regression?
Logistic
Regression
Logistic Regression is a popular statistical model used for binary
classification, that is, for predictions of the type this or that, yes or no, A
or B, etc.
Logistic regression can, however, also be extended to multiclass classification.
0: negative class
1: positive class
▪ Some examples of classification are mentioned below:
Email: spam / not spam
Online transactions: fraudulent / not fraudulent
Tumor: malignant / not malignant
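A minimal sketch of binary classification with logistic regression, assuming scikit-learn and an invented spam/not-spam style dataset with two numeric features (e.g. word counts):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented feature vectors and labels: 1 = spam (positive class), 0 = not spam (negative class)
X = np.array([[5, 1], [6, 0], [7, 2], [1, 6], [0, 7], [2, 5]])
y = np.array([1, 1, 1, 0, 0, 0])

clf = LogisticRegression().fit(X, y)
print("predicted class for [4, 2]:", clf.predict([[4, 2]]))      # 0 or 1
print("predicted probabilities:", clf.predict_proba([[4, 2]]))   # [P(class 0), P(class 1)]
```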
Logistic Regression
Hypothesis
How does Logistic
Regression work?
➢ Logistic Regression uses a more complex hypothesis than Linear
Regression: instead of a linear function, it uses the ‘Sigmoid function’,
also known as the ‘logistic function’.
➢ The hypothesis of logistic regression limits the output to values
between 0 and 1. Linear functions therefore fail to represent it, as they can
take values greater than 1 or less than 0, which is not possible as per
the hypothesis of logistic regression.
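A small sketch of the sigmoid (logistic) function, showing that its output always stays between 0 and 1 regardless of the input:

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps any real value into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

for z in (-10, -1, 0, 1, 10):
    print(f"sigmoid({z:>3}) = {sigmoid(z):.4f}")
# Output ranges from ~0.0000 near -10 up to ~1.0000 near +10, never outside (0, 1).
```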
How does Logistic
Regression work?
Decision Boundary
The prediction function returns a probability score between 0 and 1. If you want to map this to a
discrete class (true/false, yes/no), you have to select a threshold value, above which you classify
values into class 1 and below which into class 0.
p ≥ 0.5 → class = 1; p < 0.5 → class = 0
For example, suppose the threshold value is 0.5 and your prediction function returns 0.7; the
observation will be classified as positive.
If your predicted value is 0.2, which is less than the threshold value, it will be classified as
negative.
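A short sketch of the thresholding step, assuming the probability scores from a prediction function are already available (values invented for illustration):

```python
import numpy as np

# Probability scores returned by a prediction function (illustrative values)
probabilities = np.array([0.7, 0.2, 0.5, 0.95, 0.49])

threshold = 0.5
predicted_classes = (probabilities >= threshold).astype(int)

print(predicted_classes)   # [1 0 1 1 0]: scores >= 0.5 map to class 1, the rest to class 0
```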
Decision Boundary
Linear vs Logistic
Regression
Linear Regression Logistic Regression
Outcome
In linear regression, the outcome
(dependent variable) is continuous. It can
have any one of an infinite number of
possible values.
In logistic regression, the outcome
(dependent variable) has only a limited
number of possible values.
The dependent variable
Linear regression is used when your response
variable is continuous. For instance, weight,
height, number of hours, etc.
Logistic regression is used when the response
variable is categorical in nature. For instance,
yes/no, true/false, red/green/blue, 1st/2nd/3rd/4th,
etc.
Linear vs Logistic
Regression
The independent
variable
In Linear Regression, the
independent variables can be
correlated with each other.
In logistic Regression, the
independent variables should
not be correlated with each
other. (no multi-collinearity)
Equation
Linear regression gives an
equation of the form
Y = mX + C, i.e. an equation
of degree 1.
Logistic regression gives an
equation of the form
Y = 1 / (1 + e^-(mX + C)), i.e. a
sigmoid applied to a linear combination.
Linear vs Logistic
Regression
Coefficient interpretation
In linear regression, the interpretation of the
coefficients of the independent variables is quite
straightforward (i.e. holding all other variables
constant, a unit increase in a variable is expected
to increase/decrease the dependent variable by
the value of its coefficient).
In logistic regression, the interpretation depends
on the family (binomial, Poisson, etc.) and link
function (log, logit, inverse-log, etc.) you use.
Error minimization
technique
Linear regression uses the ordinary least squares
method to minimise the errors and arrive at the
best possible fit, while logistic regression uses
the maximum likelihood method to arrive at the
solution.
With the logistic loss function, large errors are
penalized to an asymptotic constant.
Linear vs Logistic
Regression