EC-452 Machine Learning
Fall 2023
COURSE INFORMATION
Course Number and Title: EC-452 Machine Learning
Credits: 3-0
Instructor(s)-in-charge: Dr. Ahmad Rauf Subhani (Assistant Prof)
Course type: Lecture
Required or Elective: Elective
Course pre-requisites: Math-361 Probability and Statistics (Preferred)
Degree and Semester: DE-42 (Electrical), Semester 7
Month and Year: Fall 2023
Assessment
Course Assessment
Exam: 1 Midterm and 1 Final Examination
Assignment: -------
Quiz: 6 Quizzes
Grading: Quiz: 10-15%
Assignments: 5-10%
Mid Semester Exam: 30-35%
Project: 0-10%
End Semester Exam: 40-50%
Topics covered in the Course
Introduction to Machine Learning
• “Machine learning is the field of study that gives computers the ability to learn without being explicitly programmed.” — Arthur L. Samuel, AI pioneer, 1959
• “A breakthrough in machine learning would be worth ten Microsofts.” — Bill Gates, Microsoft Co-Founder
Introduction to Machine Learning
• Machine Learning
• Deep Learning
• Artificial Intelligence
Introduction to Machine Learning
• Machine Learning is a tool.
• Like any other tool, it is important to read and understand its user manual.
• What are some other daily-life tools?
• Do we need a user manual for a pen or a tyre?
Applications of Machine Learning
• Email spam detection
• Face detection and matching (e.g., iPhone X)
• Web search (e.g., DuckDuckGo, Bing, Google)
• Sports predictions
• Post office (e.g., sorting letters by zip codes)
• ATMs (e.g., reading checks)
• Credit card fraud detection
• Drug design
• Medical diagnoses
• Smart assistants (Apple Siri, Amazon Alexa, …)
• Product recommendations (e.g., Netflix, Amazon)
• Self-driving cars (e.g., Uber, Tesla)
• Language translation (e.g., Google Translate)
• Sentiment analysis
• ChatGPT and Google Bard
• The list goes on…
Exercise
• As we proceed through the class, it is a good exercise to think about how machine learning could be applied to the problem areas or tasks listed above:
What is the desired outcome?
What could the dataset look like?
Is this a supervised or unsupervised problem, and what algorithms would you use? (Something to revisit later in this semester.)
How would you measure success?
What are potential challenges or pitfalls?
Common Understanding
• Feature:
• A measurable property of the object (data) you're trying to analyze.
• In datasets, features appear as columns.
• Also called: feature variable, attribute, measurement, dimension.
• Examples / Samples:
• Entries in the feature columns.
• In datasets, examples (samples, instances, observations) appear as rows.
• Target, synonymous with:
• outcome, ground truth, response variable, dependent variable, (class) label (in classification).
• Output / Prediction:
• Use this to distinguish from targets; here, it means the output from the model.
Common Understanding
• Classification
• The process of categorizing a given set of data (feature or example?) into classes.
• The classes are often referred to as targets, labels, or categories.
• Regression
• A technique for investigating the relationship between independent variables (features) and a dependent variable (outcome). It is used as a method for predictive modelling in machine learning, in which an algorithm is used to predict continuous outcomes.
(Figures: a fitted regression line in the x–y plane, and classified data plotted over two features x1 and x2.)
Categories of Machine Learning
Supervised Learning: labelled data; direct feedback; predict outcome/future.
Unsupervised Learning: no labels/targets; no feedback; find hidden structure in data.
Reinforcement Learning: decision process; reward system; learn a series of actions.
Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.
Supervised Learning Workflow
Training Data + Labels → Machine Learning Algorithm → Predictive Model
New Data → Predictive Model → Prediction
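The workflow above can be sketched in code. This is a minimal illustration with assumed toy numbers (not from the slides), using a 1-nearest-neighbor rule as the learning algorithm: the "model" is built from training data and labels, then applied to new data to produce a prediction.

```python
import numpy as np

# Training data: four labelled examples with two features each (assumed toy data).
X_train = np.array([[1.0, 1.0],
                    [1.5, 2.0],
                    [8.0, 8.0],
                    [9.0, 7.5]])
y_train = np.array([0, 0, 1, 1])

def predict(x_new):
    """Predict the label of a new point as the label of its nearest
    training example (a 1-nearest-neighbor 'predictive model')."""
    distances = np.linalg.norm(X_train - x_new, axis=1)
    return int(y_train[np.argmin(distances)])

# New, unseen data point -> prediction.
print(predict(np.array([8.5, 8.2])))  # lands near the class-1 examples
```

Note how the labels enter only at training time; at prediction time the model sees new feature vectors alone, matching the diagram.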
Supervised Learning
• Learning from labeled training data.
• Inputs that also contain the desired outputs or targets; basically, “examples” of what we want to predict.
Illustration of a binary classification problem (plus, minus) with two feature variables (x1 and x2). (Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Supervised Learning
Illustration of a linear regression model with one feature (predictor) variable x and the target (response) variable y. The dashed line indicates the functional form of the linear regression model. (Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
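The setting of this figure can be reproduced with a few lines of numpy. The data points below are assumed toy values (not from the slides), roughly following y = 2x + 1 plus noise; ordinary least squares recovers the line h(x) = w·x + b.

```python
import numpy as np

# Assumed toy data, roughly y = 2x + 1 plus a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Ordinary least-squares fit of h(x) = w*x + b.
w, b = np.polyfit(x, y, deg=1)
print(round(w, 2), round(b, 2))   # slope close to 2, intercept close to 1

# The fitted line predicts a continuous target for any new x.
print(round(w * 5.0 + b, 2))
```

Unlike classification, the output here is a continuous value, which is exactly the distinction drawn on the earlier slide.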
Unsupervised Learning
• Unsupervised learning is concerned with unlabelled data.
• Common tasks in unsupervised learning are cluster analysis (assigning group memberships) and dimensionality reduction (compressing data onto a lower-dimensional subspace or manifold).
Illustration of clustering; dashed lines indicate potential group-membership assignments of unlabeled data points. (Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Unsupervised learning
• Dimensionality reduction
Reinforcement Learning
• The process of learning from rewards while performing a series of actions.
• We do not tell the learner, for example, a (ro)bot, which action to take; we merely assign a reward to each action and/or the overall outcome.
• Instead of having a “correct/false” label for each step, the learner must discover or learn a behavior that maximizes the reward for a series of actions.
• Not a supervised setting, and somewhat related to unsupervised learning.
Illustration of reinforcement learning. (Source: Raschka & Mirjalili: Python Machine Learning, 2nd Ed.)
Common Understanding (Jargons)
• Feature:
• A measurable property of the object (data) you're trying to analyze.
• In datasets, features appear as columns.
• Also called: predictor, variable, independent variable, input, attribute, covariate.
• Examples / Samples (of training and testing):
• Entries in the feature columns.
• In datasets, examples/samples appear as rows.
• Also called: observation, training record, training instance, training sample (in some contexts, “sample” refers to a collection of training examples).
• Target, synonymous with:
• outcome, ground truth, output, response variable, dependent variable, (class) label (in classification).
• Output / Prediction: use this to distinguish from targets; here, it means the output from the model.
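The jargon above maps directly onto array shapes. In this small sketch (assumed toy dataset, hypothetical feature names), each row of the matrix is one example, each column one feature, and the target vector holds one label per example.

```python
import numpy as np

# A toy structured dataset: each row is one example, each column one feature.
feature_names = ["height_cm", "weight_kg", "age"]  # hypothetical names
X = np.array([[170.0, 65.0, 30.0],   # example 1
              [160.0, 55.0, 25.0],   # example 2
              [180.0, 80.0, 40.0]])  # example 3
y = np.array([1, 0, 1])              # targets / labels, one per example

print(X.shape)   # (3, 3): 3 examples (rows) x 3 features (columns)
print(X[0])      # one example: its feature values
print(X[:, 1])   # one feature (weight_kg) across all examples
```

A model's output on these rows would be the *predictions*, kept distinct from the targets `y` used for training and evaluation.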
Common Understanding (Jargons)
• Identify the features and examples in the following data.
Common Understanding (Jargons)
• Supervised learning:
• Learn a function to map input x (features) to output y (targets).
• Structured data:
• Databases, spreadsheets/CSV files.
• Unstructured data:
• Features like image pixels, audio signals, text sentences (before DL, extensive feature engineering was required).
Common Understanding (Jargons)
• Unstructured data
Supervised Learning
A Roadmap for Building Machine Learning Systems
Raw Data + Labels → Preprocessing → Training Dataset / Test Dataset
Training Dataset + Labels → Learning Algorithm → Learning and Evaluation → Final Model
New Data → Final Model → Prediction
Preprocessing: Feature Extraction and Scaling, Feature Selection, Dimensionality Reduction, Sampling (mostly not needed in DL)
Learning: Model Selection, Cross-Validation, Performance Metrics, Hyperparameter Optimization
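The roadmap's stages can be walked through end to end in a short numpy sketch. Everything here is assumed toy material (synthetic data, a nearest-centroid rule standing in for an arbitrary learning algorithm); the point is the shape of the pipeline: split, preprocess with training-set statistics only, learn, evaluate on held-out data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Raw data: two classes whose two features live on very different scales.
X = np.vstack([rng.normal([0.0, 0.0], [1.0, 100.0], size=(50, 2)),
               rng.normal([3.0, 300.0], [1.0, 100.0], size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# 1) Split raw data into training and test sets (labels travel with examples).
idx = rng.permutation(len(X))
train, test = idx[:70], idx[70:]

# 2) Preprocessing: scale features using *training-set* statistics only,
#    so no information leaks from the test set.
mu, sigma = X[train].mean(axis=0), X[train].std(axis=0)
X_train, X_test = (X[train] - mu) / sigma, (X[test] - mu) / sigma

# 3) Learning: a nearest-centroid rule stands in for any learning algorithm.
centroids = np.array([X_train[y[train] == c].mean(axis=0) for c in (0, 1)])

def predict(Z):
    return np.argmin(np.linalg.norm(Z[:, None] - centroids, axis=2), axis=1)

# 4) Evaluation on held-out data; the final model then labels new data.
accuracy = (predict(X_test) == y[test]).mean()
print(accuracy)
```

Scaling matters here because the second feature is ~100x larger than the first; without it, distances would be dominated by a single feature.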
Supervised Learning (Notation)
Unknown function: f(x) = y
Hypothesis: h(x) = y
Regression: h : ℝⁿ → ℝ
Classification: h : ℝⁿ → 𝒴, 𝒴 = {1, ..., k}
Training set: 𝒟 = {⟨x[i], y[i]⟩, i = 1, …, m} (“training examples”)
Data Representation
Feature vector:
x = [x1, x2, …, xn]ᵀ
Data Representation (cont.)
Feature vector: x = [x1, x2, …, xn]ᵀ
Design matrix: each row is one training example x[i]ᵀ, each column one feature:
X = [ x[1]ᵀ ]   [ x[1]1  x[1]2  ⋯  x[1]n ]
    [ x[2]ᵀ ] = [ x[2]1  x[2]2  ⋯  x[2]n ]
    [  ⋮   ]    [   ⋮      ⋮    ⋱    ⋮  ]
    [ x[m]ᵀ ]   [ x[m]1  x[m]2  ⋯  x[m]n ]
m = number of training examples
n = number of features
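This notation corresponds directly to a 2-D array: an m × n design matrix whose rows are the training examples. A minimal sketch (arbitrary placeholder values):

```python
import numpy as np

m, n = 4, 3  # m = number of training examples, n = number of features

# Design matrix: row i is training example x[i], a feature vector in R^n.
X = np.arange(m * n, dtype=float).reshape(m, n)  # placeholder values

x_i = X[1]           # one training example (a row of the design matrix)
feature_j = X[:, 2]  # one feature's values across all m examples (a column)

print(X.shape, x_i.shape, feature_j.shape)
```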
Data Representation (structured data)
Hypothesis Space
Nested views of the hypothesis space (each item contained in the one before, as in the concentric illustration):
• Entire hypothesis space
• Hypothesis space a particular learning algorithm category has access to
• Hypothesis space a particular learning algorithm can sample
• Particular hypothesis (i.e., a model/classifier)
Classes of Machine Learning Algorithms
Below are some classes of algorithms that we are going to discuss in
this class:
• Generalized linear models (e.g., logistic regression)
• Support vector machines (e.g., linear SVM, RBF-kernel SVM)
• Artificial neural networks (e.g., multi-layer perceptrons)
• Tree- or rule-based models (e.g., decision trees)
• Graphical models (e.g., Bayesian networks)
• Ensembles (e.g., Random Forest)
• Instance-based learners (e.g., K-nearest neighbors)
Algorithm Categorization Schemes
• Eager vs lazy learners
• Eager learners process training data immediately
• lazy learners defer the processing step until the prediction, e.g., the nearest neighbor algorithm.
• Batch vs online learning
• In batch learning, the model is learned on the entire set of training examples.
• Online learners, in contrast, learn from one training example at a time.
• It is common, in practical applications, to learn a model via batch learning and then update it later
using online learning.
• Parametric vs nonparametric models
• Parametric models are “fixed” models, where we assume a certain functional form for f (x) = y. For
example, linear regression with h(x) = w1x1 + ... + wmxm + b.
• Nonparametric models are more “flexible” and do not have a prespecified number of parameters. In fact, the number of parameters typically grows with the size of the training set. For example, a decision tree would be an example of a nonparametric model, where each decision node (e.g., a binary “True/False” assertion) can be regarded as a parameter.
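The parametric/nonparametric contrast can be made concrete by counting what each model stores. In this sketch (assumed toy setup, not from the slides), a linear model keeps a fixed n + 1 weights, while a 1-nearest-neighbor "model" is simply the retained training set, whose size grows with m.

```python
import numpy as np

rng = np.random.default_rng(2)

def n_params_linear(X):
    # Parametric: h(x) = w1*x1 + ... + wn*xn + b has n + 1 parameters,
    # fixed in advance regardless of how much data we collect.
    return X.shape[1] + 1

def n_params_1nn(X):
    # Nonparametric: a 1-nearest-neighbor "model" is the stored training
    # set itself, so its size grows with the number of examples.
    return X.size

for m in (10, 1000):
    X = rng.normal(size=(m, 3))
    print(m, n_params_linear(X), n_params_1nn(X))
```

This also illustrates the eager/lazy split above: the linear model does its work at training time, while 1-NN defers all computation to prediction time.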
Algorithm Categorization Schemes
• Discriminative vs generative
• Generative models (classically) describe methods that model the joint distribution P(X, Y) = P(Y)P(X|Y) = P(X)P(Y|X) for training pairs ⟨x[i], y[i]⟩.
• Discriminative models take a more “direct” approach, modeling P(Y|X) directly.
• While generative models typically provide more insights and allow sampling from the joint distribution, discriminative models are typically easier to compute and produce more accurate predictions.
• Discriminative modeling is like trying to extract information from text in a foreign language without learning that language.
• Generative modeling is like generating text in a foreign language.
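A tiny generative classifier makes the factorization concrete. In this sketch (assumed 1-D toy data with Gaussian class-conditionals, an assumption not in the slides), we estimate P(Y) and P(X|Y) per class and recover P(Y|X) via Bayes' rule; because the joint distribution is modelled, the same fit can also sample new data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D data: class 0 ~ N(0, 1), class 1 ~ N(4, 1)  (assumed distributions).
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(4.0, 1.0, 200)])
y = np.array([0] * 200 + [1] * 200)

# Generative fit: estimate P(Y) and Gaussian P(X|Y) for each class.
priors = np.array([(y == c).mean() for c in (0, 1)])
means = np.array([x[y == c].mean() for c in (0, 1)])
stds = np.array([x[y == c].std() for c in (0, 1)])

def posterior(x_new):
    lik = np.exp(-0.5 * ((x_new - means) / stds) ** 2) / stds  # Gaussian P(X|Y)
    joint = priors * lik                                       # P(X, Y)
    return joint / joint.sum()                                 # P(Y|X), Bayes' rule

# Because the joint distribution is modelled, we can also *sample* new x values:
c = rng.choice(2, p=priors)
x_sampled = rng.normal(means[c], stds[c])

print(posterior(0.0).round(3), posterior(4.0).round(3))
```

A discriminative model would instead fit P(Y|X) directly (e.g., a logistic curve over x) and could classify, but not sample.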