Computational Learning Theory
By: Zoobia Rana
CMS ID: 2977-2023
What is Computational Learning Theory?
Computational Learning Theory (CoLT) is a field of AI research that studies the design of machine learning algorithms in order to determine what sorts of problems are "learnable." Its ultimate goals are to understand the theoretical underpinnings of learning programs, what makes them work or not, and to improve their accuracy and efficiency. The field merges many disciplines, such as probability theory, statistics, optimization, information theory, calculus, and geometry.
Machine Learning (ML)
Machine learning is the scientific study of algorithms and statistical models that computer systems use to perform a specific task without using explicit instructions, relying on patterns and inference instead. It is seen as a subset of artificial intelligence. Machine learning algorithms build a mathematical model based on sample data, known as "training data", in order to make predictions or decisions without being explicitly programmed to perform the task. Machine learning algorithms are used in a wide variety of applications, such as email filtering and computer vision, where it is difficult or infeasible to develop a conventional algorithm for effectively performing the task.
Computational learning theory studies the time complexity and feasibility of learning
In computational learning theory, a computation is considered feasible if it can be done in polynomial time. There are two kinds of time complexity results:
• Positive results – showing that a certain class of functions is learnable in polynomial time.
• Negative results – showing that certain classes cannot be learned in polynomial time.
Negative results often rely on commonly believed but as yet unproven assumptions, such as:
• Computational complexity – P ≠ NP (the P versus NP problem);
• Cryptography – one-way functions exist.
The goals of computational learning theory
• To make explicit relevant aspects of the learner and the environment
• To identify easy and hard learning problems (and the precise conditions under which they are easy or hard)
• To guide the design of learning systems
• To help analyze the performance of learning systems
• To shed light on natural learning systems
• To develop and analyze models
Computational Models of Learning
• Mistake bound model
• The Halving Algorithm
• PAC (Probably Approximately Correct) model
• Occam Algorithm
• VC Dimension and Sample Complexity
• Kolmogorov Complexity
PAC stands for Probably Approximately Correct
It means that machine learning approaches offer a probabilistic solution to a given problem, and this solution tends to be approximately correct.
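Stated a bit more formally (a standard textbook formulation, not spelled out on the slide): given an accuracy parameter ε and a confidence parameter δ, the learner must, with probability at least 1 − δ over the random training sample S of m examples, output a hypothesis h_S whose error under the data distribution D is at most ε, where c is the unknown target concept:

```latex
\Pr_{S \sim D^m}\!\left[\, \mathrm{error}_D(h_S) \le \varepsilon \,\right] \ge 1 - \delta,
\qquad
\mathrm{error}_D(h) = \Pr_{x \sim D}\left[\, h(x) \ne c(x) \,\right].
```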
What are learning systems?
01 – Definition: systems that improve their performance on one or more tasks with experience in their environment.
02 – Examples: pattern recognizers, adaptive control systems, adaptive intelligent agents, etc.
Theories of
Learning: What are
they good for?
Mistake Bound Model
Given an arbitrary, noise-free sequence of labeled examples of an unknown binary conjunctive concept C over {0,1}^N, the learner's task is to predict C(X) for a given X.
• Initialize H = {X1, ~X1, X2, ~X2, ..., XN, ~XN}
• Predict according to the match between an instance and the conjunction of literals in H
• Whenever a mistake is made on a positive example, drop the offending literals from H
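A minimal Python sketch of the elimination algorithm outlined above (the function names and the toy example are illustrative, not from the slides). Because literals of the true concept are never dropped, the learner never errs on negative examples, and the standard analysis gives at most N + 1 mistakes in total.

```python
# Online learning of a conjunction of literals over {0,1}^N in the
# mistake-bound model, via the elimination algorithm on the slide.

def predict(H, x):
    """Predict 1 iff every literal in H is satisfied by x.
    A literal is (index, sign): sign=True means X_i, sign=False means ~X_i."""
    return int(all((x[i] == 1) if positive else (x[i] == 0)
                   for (i, positive) in H))

def update(H, x, y, y_hat):
    """On a mistake on a positive example, drop the literals that x violates."""
    if y == 1 and y_hat == 0:
        H = {(i, positive) for (i, positive) in H
             if (x[i] == 1) == positive}
    return H

def run_online(examples, n):
    # Initialize H with all 2n literals: X_1, ~X_1, ..., X_n, ~X_n.
    H = {(i, sign) for i in range(n) for sign in (True, False)}
    mistakes = 0
    for x, y in examples:
        y_hat = predict(H, x)
        if y_hat != y:
            mistakes += 1
            H = update(H, x, y, y_hat)
    return H, mistakes

# Toy run: target concept C(x) = X1 AND ~X3 over {0,1}^3.
examples = [((1, 0, 0), 1), ((1, 1, 1), 0), ((1, 1, 0), 1), ((0, 0, 0), 0)]
H, mistakes = run_online(examples, n=3)
print(H, mistakes)   # recovers {X1, ~X3} after 2 mistakes
```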
PAC Learning
Consider:
> An instance space X
> A concept space C = {c : X → {0,1}}
> A hypothesis space H = {h : X → {0,1}}
> An unknown, arbitrary, not necessarily computable, stationary probability distribution D over the instance space X
PAC Learning
The oracle samples the instance space according to D and provides labeled examples of an unknown concept C to the learner.
The learner is tested on samples drawn from the instance space according to the same probability distribution D.
The learner's task is to output a hypothesis h from H that closely approximates the unknown concept C, based on the examples it has encountered.
PAC Learning
In the PAC setting, exact learning (zero-error approximation) cannot be guaranteed.
In the PAC setting, even approximate learning (with bounded, non-zero error) cannot be guaranteed 100% of the time.
Hence "probably approximately correct": the learner is only required to output a hypothesis with error at most ε, with probability at least 1 − δ.
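As a concrete companion to the PAC slides, here is a small Python sketch of the classic sample-complexity bound for a consistent learner over a finite hypothesis space, m ≥ (1/ε)(ln|H| + ln(1/δ)). This bound is standard in the literature but is not stated on the slides, and the conjunction example is only illustrative.

```python
import math

def pac_sample_size(hypothesis_space_size, epsilon, delta):
    """Number of i.i.d. examples sufficient so that, with probability at
    least 1 - delta, any hypothesis consistent with the sample has true
    error at most epsilon (finite hypothesis space, consistent learner)."""
    return math.ceil((math.log(hypothesis_space_size)
                      + math.log(1.0 / delta)) / epsilon)

# Conjunctions over N boolean variables: each variable appears positively,
# negatively, or not at all, so |H| is roughly 3^N.
N = 10
print(pac_sample_size(3 ** N, epsilon=0.1, delta=0.05))
```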
VC Dimension (Vapnik–Chervonenkis Dimension)
The VC dimension is a measure of the capacity of a space of functions that can be learned by a statistical classification algorithm.
Shattering
A hypothesis class H shatters N data points if, for every possible labeling of those points, we can find a hypothesis h in H that separates the positive examples from the negative ones.
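The definition above can be checked mechanically on a toy hypothesis class. The sketch below (an assumed example, not from the slides) brute-forces all labelings for the class of closed intervals on the real line: it shatters any 2 distinct points but no set of 3, so its VC dimension is 2.

```python
def shatters(points, hypotheses):
    """True iff every labeling of `points` is realized by some hypothesis."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def interval_hypotheses(points):
    # Enough candidate intervals [a, b]: endpoints at the points themselves
    # plus one value below the minimum and one above the maximum.
    cuts = sorted(points)
    candidates = [cuts[0] - 1] + cuts + [cuts[-1] + 1]
    return [(lambda x, a=a, b=b: int(a <= x <= b))
            for a in candidates for b in candidates]

print(shatters([1.0, 2.0], interval_hypotheses([1.0, 2.0])))            # True
print(shatters([1.0, 2.0, 3.0], interval_hypotheses([1.0, 2.0, 3.0])))  # False
```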
THANK YOU
