Support Vector Machine
➔ One of the most popular supervised learning algorithms.
➔ Used for both classification and regression tasks (but most widely used for classification).
➔ Works better on smaller datasets.
➔ Works well in high-dimensional spaces (many independent variables).
➔ Works well even when the number of features exceeds the number of observations.
➔ Does not perform well when the dataset is noisy, i.e. the target classes overlap.
➔ Does not perform well on large datasets, because the required training time is high.
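The classification/regression split above can be sketched with sklearn's `SVC` and `SVR` classes. The tiny datasets below are purely illustrative:

```python
import numpy as np
from sklearn.svm import SVC, SVR

# Classification: SVC separates two classes with a maximum-margin boundary.
Xc = np.array([[0.0], [1.0], [4.0], [5.0]])
yc = np.array([0, 0, 1, 1])
clf = SVC(kernel="linear").fit(Xc, yc)
print(clf.predict([[0.5], [4.5]]))  # → [0 1]

# Regression: SVR fits an epsilon-insensitive tube around the data instead.
Xr = np.linspace(0, 10, 50).reshape(-1, 1)
yr = 2.0 * Xr.ravel() + 1.0
reg = SVR(kernel="linear", C=10.0).fit(Xr, yr)
print(reg.predict([[3.0]]))  # close to 2*3 + 1 = 7
```

Both estimators share the same kernel machinery; only the loss differs.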
The goal of the support vector machine algorithm is to find the best
line or decision boundary in an n-dimensional space (n = the number
of features) that distinctly separates the data points of each class.
This best decision boundary is called a hyperplane.
Our objective is to find the hyperplane with the maximum margin, i.e.
the maximum distance between the data points of both classes.
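A minimal sketch of this idea, assuming sklearn and a toy 2-D dataset: a linear-kernel SVC is fitted with a large C (so the margin is essentially hard), and the margin width is recovered as 2/||w||:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes in 2-D (illustrative data).
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 0.5],
              [4.0, 4.0], [5.0, 4.5], [4.5, 3.5]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)

# For a linear SVM, the geometric margin width is 2 / ||w||.
w = clf.coef_[0]
print(f"margin width = {2 / np.linalg.norm(w):.2f}")
print(clf.predict([[1.0, 1.5], [4.8, 4.0]]))  # one point near each cluster
```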
Terminologies:
Margin: the perpendicular distance between
the closest data points and the hyperplane.
The optimal hyperplane with the maximum
margin is termed the maximum-margin hyperplane.
The closest points, from which the margin is
measured, are called support vectors.
➔ Support vectors are the data points closest to the hyperplane.
➔ Support vectors determine the position and orientation of the hyperplane.
➔ With the help of the support vectors, we maximise the margin of the classifier.
➔ Support vectors are the points that build the SVM classifier.
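A fitted sklearn model exposes these boundary-defining points directly via the `support_vectors_` attribute (the data below is illustrative):

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 1.0], [2.0, 1.0], [1.0, 2.0],
              [4.0, 4.0], [5.0, 4.0], [4.0, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)

# The support vectors are the training points closest to the hyperplane;
# only they determine its position and orientation.
print(clf.support_vectors_)  # the boundary-defining points
print(clf.n_support_)        # count of support vectors per class
```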
SVM_notes.pdf
Terminologies:
Regularization:
➔ The ‘C’ parameter in the Python sklearn library.
➔ Controls how strongly the SVM classifier is penalised for misclassifying data.
C large → margin of the hyperplane is small
C small → margin of the hyperplane is large
Problems with setting C values:
C too small → risk of underfitting
C too large → risk of overfitting
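The C–margin trade-off can be observed on synthetic overlapping data (illustrative; the exact margin values depend on the data, but a smaller C should yield a wider margin):

```python
import numpy as np
from sklearn.svm import SVC

# Two overlapping Gaussian blobs, so the slack penalty C actually matters.
rng = np.random.RandomState(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

for C in (0.01, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    margin = 2.0 / np.linalg.norm(clf.coef_)  # geometric margin width
    print(f"C={C}: margin width = {margin:.2f}")
```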
Terminologies:
Gamma:
➔ Defines how far the influence of a single training example reaches
in the calculation of the line of separation.
➔ Low gamma: points far from the hyperplane are also considered in
the calculation.
➔ High gamma: only points close to the hyperplane are considered in
the calculation.
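The effect of gamma can be sketched with an RBF-kernel SVC on sklearn's `make_moons` toy data (illustrative values; a very high gamma tends to memorise the training set):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.25, random_state=0)

# Low gamma: each point's influence reaches far -> smooth, general boundary.
# High gamma: influence is very local -> wiggly boundary that can overfit.
for gamma in (0.1, 200.0):
    clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)
    print(f"gamma={gamma}: training accuracy = {clf.score(X, y):.2f}")
```

High training accuracy at high gamma is the overfitting warning sign: the model is being shaped only by points very close to the boundary.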
Outliers in the data can shift the decision boundary and lead
to wrong predictions.
Terminologies:
Kernels:
➔ The kernel is the technique used by SVM to classify non-linear data.
➔ Kernel functions increase the dimension of the data, so that SVM
can fit the optimal hyperplane to separate it.
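A quick sketch of why a kernel helps, using sklearn's concentric-circles toy data: no straight line separates the two rings, but the RBF kernel implicitly lifts the points into a higher-dimensional space where a hyperplane does:

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: not linearly separable in the original 2-D space.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)

print(f"linear kernel accuracy: {linear.score(X, y):.2f}")  # near chance
print(f"RBF kernel accuracy:    {rbf.score(X, y):.2f}")     # near perfect
```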
2D and 3D feature space
If the number of input features is 2, the hyperplane is just a line.
If the number of input features is 3, the hyperplane becomes a two-dimensional plane.
It becomes difficult to visualise when the number of features exceeds 3.
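This correspondence shows up in the shape of a fitted linear SVM's weight vector: one coefficient per feature, whatever the dimension (toy data via sklearn's `make_classification`):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# With n input features, the fitted linear SVM stores one weight per feature:
# a line for n=2, a plane for n=3, a hyperplane beyond that.
for n in (2, 3, 10):
    X, y = make_classification(n_samples=60, n_features=n, n_redundant=0,
                               random_state=0)
    clf = SVC(kernel="linear").fit(X, y)
    print(n, clf.coef_.shape)  # (1, n)
```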
Types of SVM
01 Linear SVM
Linear SVM is used for linearly separable data, i.e. a dataset that
can be split into two classes by a single straight line.
02 Non-Linear SVM
Non-Linear SVM is used for non-linearly separable data, i.e. a dataset
that cannot be split by a straight line. Such data is termed non-linear,
and the classifier used is called a non-linear SVM classifier.