Lesson 04: Uncertainty
Pt.1 Probabilistic Methods
Uncertainty, belief, probability, fuzziness
UPM Survey
The link to the surveys is:
http://servicios.upm.es/encuestas
Log in with your e-mail address without
the “@alumnos.upm.es” part, plus your password.
Surveys for the course and for the
lecturers are in the same session.
IA / 4 Uncertainty 3
Uncertain (adjective)
 doubtful
 a: not known beyond doubt
 b: not having certain knowledge
 c: not clearly identified or defined
 not constant
 indefinite, indeterminate
 not certain to occur
 not reliable, untrustworthy
Content
 The problem of uncertainty
 Handling uncertainty
 Probabilistic Methods
 Fuzzy Methods
The problem of uncertainty
From clean worlds to noisy worlds
 The AI methods we have seen so far have
representations that perfectly match their “worlds”
 Inferences by sound reasoning can then be translated
to the real world and they make sense there
 Unfortunately, this is not always the situation we face in real life
 The world is noisy, perception is noisy, reasoning is noisy
A historical example of the problem
 Expert Systems leverage expert decision making in a
limited domain (medical diagnosis, computer
configuration, machine fault diagnosis, etc.).
 Knowledge is extracted from a willing expert
 Knowledge is usually represented as inference rules
 These systems can replace human experts in their
narrow, precise, domains of expertise
The MYCIN Expert System
 An AI program for diagnosis and therapy selection for
bacterial infections of the blood
 It provided advice to non-expert physicians under time
constraints and with incomplete evidence
 Example MYCIN rule:
if stain of organism is gramneg and
morphology is rod and
aerobicity is aerobic
then strongly suggestive (0.8) that
organism is enterobacteriaceae.
MYCIN Architecture (diagram)
 Subsystems: Consultation System, Explanation System, Knowledge Acquisition System, Q-A System
 Dynamic DB: Patient Data, Context Tree, Dynamic Data
 Static DB: Rules, Parameter Properties, Context Type Properties, Tables, Lists
 Users: Physician, Expert
Knowledge and Reasoning process
 MYCIN domain-specific knowledge in over 450 rules
 Each rule is completely modular, all relevant context is
contained in the rule with explicitly stated premises
 Inexact reasoning uses certainty factors (CFs):
 Facts and rules have assigned confidences, from 0 to 1.
 Truth of a hypothesis is measured by a sum of the CFs
 Positive sum is confirming evidence
 Negative sum is disconfirming evidence
 MYCIN handled uncertainty. To an extent.
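The CF arithmetic can be sketched in code. Below is a minimal version of the combination rule usually attributed to MYCIN; note that in the actual system CFs ranged over [-1, 1], with negative values for disconfirming evidence. The function name is my own.

```python
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors in the MYCIN style (each in [-1, 1])."""
    if cf1 >= 0 and cf2 >= 0:
        # Two pieces of confirming evidence reinforce each other
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        # Two pieces of disconfirming evidence reinforce each other downward
        return cf1 + cf2 * (1 + cf1)
    # Mixed evidence partially cancels
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two rules support the same hypothesis with CF 0.8 and CF 0.4:
print(round(combine_cf(0.8, 0.4), 4))  # → 0.88
```

Note that the combination never exceeds 1: each new piece of confirming evidence only closes part of the remaining gap.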
Handling uncertainty
Different problems to consider
 Imprecision and accuracy
 Lack of exactness or accuracy
 “The result was 13.8 ± 0.3”
 Vagueness
 Lack of certainty or distinctness
 Lack of preciseness in thought or communication
 “The result was quite good”
 Uncertainty
 Not able to be relied on; not known or definite
 Not completely confident or sure of something
 “I believed that the result was 13”
(Spanish glosses: precision = precisión; accuracy = exactitud)
Uncertainty in AI
 Uncertainty is pervasive in “real-world” AI
 AI agents suffer uncertainty in:
 Perception
 Representation
 Reasoning
 Action
 In this lesson we will show a couple of methods to deal
with uncertainty: probabilistic and fuzzy
Thought
Representation
Limits of (classical) logics
 Trying to use logic to cope with a real-world domain like
medical diagnosis fails for three main reasons:
 Laziness/time/cost: It is too much work to list the complete set of
antecedents or consequents needed to ensure an exceptionless rule and
too hard to use such rules.
 Theoretical ignorance: Medical science –or automobile engineering– has
no complete, usable, theory for the domain.
 Practical ignorance: Even if we know all the rules, we might be uncertain
about a particular patient because not all the necessary tests have been
or can be run or have noise or …
 c.f. the frame, qualification and ramification problems
Elimination of simplifying limitations
 In previous lessons we have seen fully observable,
deterministic, static, known environments
 In these worlds the solution to a problem is a precise
sequence of actions.
 For AIs addressing the real world we shall relax those
constraints: real world is not so neat.
 Agents in the real world need to handle uncertainty,
whether due to partial observability, nondeterminism,
dynamism, or the existence of adversaries.
Two Classes of Uncertainty
 Uncertainty –messiness in the mind of the agent– can
have two origins:
 Ontological: The world is messy
 Epistemological: The world is neat but
the agent can’t get a perfect picture
of it*
Representation
Real-world
* Neat in terms of the representation formalism
Ontological: a Schrödinger cat
Epistemological: a dimly seen rabbit
Copilot, please make a
poem about a dimly seen
rabbit using quantum
mechanics terms
In the twilight's gentle embrace,
A rabbit's form, a fleeting trace,
In quantum realms, it hops and plays,
In superposition, it spends its days.
With wavefunctions that softly sway,
Its path uncertain, it cannot stay,
Entangled with the moonlit night,
A particle of pure delight.
Through Heisenberg's uncertain veil,
Its position and momentum frail,
A quantum leap, a boundless flight,
In the dimly seen, it finds its light.
In Schrödinger's box, it might reside,
Both here and there, it does not hide,
A paradox in every hop,
In quantum fields, it cannot stop.
So let us ponder, let us dream,
Of rabbits in a quantum scheme,
Where physics and the night entwine,
In a dance both wondrous and divine.
“Through Heisenberg's uncertain veil”
We’ll see two strategies
 Probabilistic methods
 Deal with epistemological uncertainty
 Fuzzy methods
 Deal with ontological uncertainty
 Other:
 Necessity/Possibility theory
 Dempster/Shafer belief functions theory
 Plausibility theory
 Qualitative / Interval logics
 …
Probabilistic Methods
Dealing with degrees of belief
 In general, when there is a connection between real-
world facts (cause → effect, if P then Q), there are no
strict logical consequences in the other direction.
The world is complex (if P ∧ R ∧ S ∧ T ∧ … then Q).
 the qualification problem
 Rain → Wet road
 Wet road ↛ Rain (not a strict consequence)
I notice the road is
wet.
Could it have rained?
Knowledge and Belief
 Philosophers have said that
knowledge is “justified true belief”; however, …
 The agent’s “knowledge” can at best provide only a
degree of belief in the relevant sentences.
 Our main tool for dealing with degrees of belief is
probability theory.
 Probability theory also talks about “possible worlds”.
Probability theory commitments
 Probability provides a way of summarizing the
uncertainty that comes from our laziness and
ignorance, thereby “solving” the qualification problem.
 Commitments in logic and probability:
 The ontological commitments are the same—that the world is composed
of facts that hold or not.
 The epistemological commitments are different: a logical agent believes
each sentence to be {true, false, no opinion} whereas a probabilistic
agent may have a numerical degree of belief between 0 (for sentences
that are certainly false) and 1 (certainly true).
Fundamental concepts in probability
 Basics of probability:
 Random Variables
 Joint and Marginal Distributions
 Conditional Distribution
 Product Rule, Chain Rule, Bayes’ Rule
 Inference
 Independence
 Uses: Bayesian networks, Markov chains, etc.
 This is basic probability stuff (statistics) that is used a
lot in current AI and robotics methods
Probabilistic Inference in Ghostbusters
 A ghost is hidden in the grid somewhere
 Sensor readings tell how close a square is to the ghost
 On the ghost: red
 1 or 2 away: orange
 3 or 4 away: yellow
 5+ away: green
 If sensors are noisy, we may
know P(Color | Distance)
P(red | 3) P(orange | 3) P(yellow | 3) P(green | 3)
0.05 0.15 0.5 0.3
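A conditional table like this can be used directly as a sampling model for the noisy sensor. A small sketch (the probabilities are the slide's; the variable and function names are my own):

```python
import random

# P(Color | Distance = 3), from the table above
p_color_given_d3 = {"red": 0.05, "orange": 0.15, "yellow": 0.5, "green": 0.3}

def read_sensor(model):
    """Draw one noisy reading from a conditional distribution P(Color | d)."""
    colors = list(model)
    return random.choices(colors, weights=[model[c] for c in colors])[0]

assert abs(sum(p_color_given_d3.values()) - 1.0) < 1e-9  # each row must sum to 1
print(read_sensor(p_color_given_d3))  # most often 'yellow' at distance 3
```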
Dealing with Uncertainty
 General situation:
 Observed variables (evidence): Agent knows certain things
about the state of the world (e.g., sensor readings or
symptoms)
 Unobserved variables: Agent needs to reason about other
aspects (e.g. where an object is or what disease is present)
 Model: Agent knows something about how the known
variables relate to the unknown variables
 Probabilistic reasoning gives us a framework
for managing our beliefs (e.g. with new info)
Random Variables
 A random variable is some aspect of the world about
which we (may) have uncertainty
 R = Is it raining?
 T = Is it hot or cold?
 D = How long will it take to drive to work?
 L = Where is the ghost?
 Random variables have ranges (maybe non-numeric)
 R in {true, false} (often written as {+r, -r})
 T in {hot, cold}
 D in [0, ∞)
 L in possible locations, maybe {(0,0), (0,1), …}
We denote random
variables with upper
case letters: R,T,…
Probability Distributions
Random variables have probability distributions
Associate a probability with each value in the range
Temperature:
T P
hot 0.5
cold 0.5
Weather:
W P
sun 0.6
rain 0.1
fog 0.3
meteor 0.0
A probability value for each “possible world”
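As a quick sketch, a distribution can be stored as a value-to-probability mapping and checked against the axioms (the numbers are the slide's; the function name is mine):

```python
temperature = {"hot": 0.5, "cold": 0.5}
weather = {"sun": 0.6, "rain": 0.1, "fog": 0.3, "meteor": 0.0}

def is_valid_distribution(dist, tol=1e-9):
    """Axioms: every probability is non-negative and they sum to 1."""
    return all(p >= 0 for p in dist.values()) and abs(sum(dist.values()) - 1.0) < tol

print(is_valid_distribution(temperature), is_valid_distribution(weather))  # → True True
```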
Ranges may be multidimensional
 A two-dimensional
normal distribution
Ranges may also be discrete or continuous
Probability Distributions
 A distribution (upper case P) is a function from
values to probabilities.
Shorthand notation: P(hot) is short for P(T = hot)
 A probability (lower case p) is a
single number
p = P(T = hot) = 0.5
 Axioms for a valid “probability
distribution”:
P(X = x) ≥ 0 for every value x, and Σx P(X = x) = 1
Joint Distributions
 A joint distribution over a set of random variables X1, …, Xn
specifies a real number for each tuple (or outcome):
P(X1 = x1, …, Xn = xn)
 Must obey:
P(x1, …, xn) ≥ 0 and Σ(x1, …, xn) P(x1, …, xn) = 1
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
A probabilistic model
is a joint distribution
Events
 An event is a set E of outcomes
 From a joint distribution, we can
calculate the probability of any event
 Probability that it’s hot AND sunny? 0.4
 Probability that it’s hot? 0.4+0.1 = 0.5
 Probability that it’s hot OR sunny? 0.4+0.1+0.2=0.7
 Typically, the events we care about are
partial assignments, like P(W=sunny)
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
“W=sunny” event
P(W=sunny) = 0.6
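The event calculations above can be sketched as a sum over the matching outcomes of the joint (table values from the slide, helper names mine):

```python
# Joint distribution P(T, W) from the table above
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def p_event(joint, predicate):
    """P(E) = sum of the probabilities of all outcomes in the event E."""
    return sum(p for outcome, p in joint.items() if predicate(outcome))

print(round(p_event(joint, lambda o: o == ("hot", "sun")), 10))              # → 0.4
print(round(p_event(joint, lambda o: o[0] == "hot"), 10))                    # → 0.5
print(round(p_event(joint, lambda o: o[0] == "hot" or o[1] == "sun"), 10))   # → 0.7
```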
Quiz: Events
 P(+x AND +y) ?
 P(+x) ?
 P(-y OR +x) ?
X Y P
+x +y 0.2
+x -y 0.3
-x +y 0.4
-x -y 0.1
Marginal Distributions
 Marginal distributions are sub-tables which eliminate variables
 Marginalization (summing out): Combine collapsed rows by
adding
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
T P
hot 0.5
cold 0.5
W P
sun 0.6
rain 0.4
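Summing out can be written directly as a loop over the joint table (table values from the slide; function name mine):

```python
from collections import defaultdict

joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def marginalize(joint, axis):
    """Sum out everything except the variable at position `axis` (0 = T, 1 = W)."""
    marginal = defaultdict(float)
    for outcome, p in joint.items():
        marginal[outcome[axis]] += p
    return {value: round(p, 10) for value, p in marginal.items()}

print(marginalize(joint, 0))  # → {'hot': 0.5, 'cold': 0.5}
print(marginalize(joint, 1))  # → {'sun': 0.6, 'rain': 0.4}
```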
Quiz: Compute Marginal Distributions
X Y P
+x +y 0.2
+x -y 0.3
-x +y 0.4
-x -y 0.1
X P
+x
-x
Y P
+y
-y
Quiz: Compute Marginal Distributions
X Y P
+x +y 0.2
+x -y 0.3
-x +y 0.4
-x -y 0.1
X P
+x 0.5
-x 0.5
Y P
+y 0.6
-y 0.4
Conditional Probabilities
 Conditional probability is the probability of a, given b:
P(a | b) = P(a, b) / P(b)
b is called “evidence”
 What is the probability of “sun” if it is “cold”?
 What is the probability of a sunny day being cold (P(cold | sunny))?
 This is the definition of a conditional probability
 It is a simple relation between joint, conditional and
marginal probabilities
Conditional Probabilities
 Use the simple relation between joint and conditional
probabilities (probability of a, given b) taken from the
definition of conditional probability:
P(a | b) = P(a, b) / P(b)
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
What is the probability of “sun” if it is “cold”?
P(sun | cold) = P(sun, cold) / P(cold) = 0.2 / (0.2 + 0.3) = 0.4
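The same computation, straight from the definition, as a short sketch (table values from the slide; function name mine):

```python
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def p_w_given_t(joint, w, t):
    """P(W=w | T=t) = P(W=w, T=t) / P(T=t), straight from the definition."""
    p_t = sum(p for (tv, _), p in joint.items() if tv == t)
    return joint[(t, w)] / p_t

print(p_w_given_t(joint, "sun", "cold"))  # → 0.4 (= 0.2 / 0.5)
```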
Quiz: Conditional Probabilities
 P(+x | +y) ?
 P(-x | +y) ?
X Y P
+x +y 0.2
+x -y 0.3
-x +y 0.4
-x -y 0.1
Conditional Distributions
 Conditional distributions are probability distributions
over some variables given fixed values of others
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
P(W | T = hot):
W P
sun 0.8
rain 0.2
P(W | T = cold):
W P
sun 0.4
rain 0.6
Conditional Distributions are derived from the Joint Distribution
Normalization Trick (scale to 1.0)
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
W P
sun 0.4
rain 0.6
SELECT the joint
probabilities
matching the
evidence (T=cold)
Normalization Trick
T W P
hot sun 0.4
hot rain 0.1
cold sun 0.2
cold rain 0.3
W P
sun 0.4
rain 0.6
T W P
cold sun 0.2
cold rain 0.3
NORMALIZE the
selection
(make it sum 1.0)
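The SELECT and NORMALIZE steps above can be sketched in a few lines (table values from the slide; names mine):

```python
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def select_and_normalize(joint, t):
    """SELECT the rows matching the evidence T=t, then NORMALIZE to sum 1.0."""
    selected = {w: p for (tv, w), p in joint.items() if tv == t}
    z = sum(selected.values())  # the normalization constant, = P(T=t)
    return {w: p / z for w, p in selected.items()}

print(select_and_normalize(joint, "cold"))  # → {'sun': 0.4, 'rain': 0.6}
```

Note that the normalization constant z is itself the marginal P(T=t), which is why the trick yields exactly the conditional distribution.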
Quiz: Normalization Trick
X Y P
+x +y 0.2
+x -y 0.3
-x +y 0.4
-x -y 0.1
SELECT the joint
probabilities
matching the
evidence
NORMALIZE the
selection
(make it sum 1.0)
 P(X | Y=-y) ?
Independence
 Two random variables are said to be independent if
knowing the value of one does not change the probability
of the other:
P(a | b) = P(a)
P(b | a) = P(b)
P(a ∧ b) = P(a) P(b)
 When it is available, independence can help in reducing
the size of the domain representation and the
complexity of the inference problem.
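Independence can be tested cell by cell against the product of the marginals. A sketch using the earlier T/W table (names mine):

```python
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def independent(joint, tol=1e-9):
    """True iff P(t, w) == P(t) * P(w) for every cell of the joint."""
    p_t, p_w = {}, {}
    for (t, w), p in joint.items():
        p_t[t] = p_t.get(t, 0.0) + p
        p_w[w] = p_w.get(w, 0.0) + p
    return all(abs(p - p_t[t] * p_w[w]) < tol for (t, w), p in joint.items())

print(independent(joint))  # → False: P(hot, sun)=0.4 but P(hot)*P(sun)=0.5*0.6=0.3
```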
Probabilistic Inference
 Probabilistic inference is the computation of a specific
probability from other, known probabilities
(e.g. conditional from joint)
 We generally compute conditional probabilities
 P(on time | no reported accidents) = 0.90
 These represent the agent’s beliefs given the evidence
 Probabilities change with new evidence:
 P(on time | no accidents, 5 a.m.) = 0.95
 P(on time | no accidents, 5 a.m., raining) = 0.80
 Observing new evidence causes beliefs to be updated
Probabilistic Inference Rules
 From the definition of a conditional probability:
P(a | b) = P(a, b) / P(b)
 We can derive some inference rules:
 The product rule: P(a, b) = P(a | b) P(b)
 The chain rule: P(x1, …, xn) = ∏i P(xi | x1, …, xi−1)
 Bayes’ rule: P(a | b) = P(b | a) P(a) / P(b)
The Product Rule
 Compute joint distributions from conditional and
marginal distributions:
P(a, b) = P(a | b) P(b)
Quiz: The Product Rule
R P
sun 0.8
rain 0.2
D W P
wet sun 0.1
dry sun 0.9
wet rain 0.7
dry rain 0.3
D W P
wet sun 0.08
dry sun 0.72
wet rain 0.14
dry rain 0.06
Check: 0.08 + 0.72 + 0.14 + 0.06 = 1.0
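The quiz answer above is one product-rule multiplication per row (table values from the slide; variable names mine):

```python
p_w = {"sun": 0.8, "rain": 0.2}                              # marginal P(W)
p_d_given_w = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}   # conditional P(D | W)

# Product rule: P(d, w) = P(d | w) * P(w)
joint = {(d, w): round(p * p_w[w], 10) for (d, w), p in p_d_given_w.items()}
print(joint)
# → {('wet', 'sun'): 0.08, ('dry', 'sun'): 0.72, ('wet', 'rain'): 0.14, ('dry', 'rain'): 0.06}
print(round(sum(joint.values()), 10))  # → 1.0, so it is a valid joint distribution
```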
The Chain Rule
 Compute a joint distribution as an incremental product of
conditional distributions:
P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2)
The chain rule
P(6, 6, 6)?
P(J, J, J)?
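A worked sketch of both questions, assuming (as the slide's figure presumably did) a fair die for P(6, 6, 6) and three jacks drawn without replacement from a standard 52-card deck for P(J, J, J):

```python
# Chain rule: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2)

# Three rolls of a fair die are independent, so every conditional is 1/6:
p_666 = (1 / 6) * (1 / 6) * (1 / 6)
print(round(p_666, 5))  # → 0.00463

# Drawing three jacks without replacement is NOT independent;
# each conditional shrinks as jacks leave the deck:
p_jjj = (4 / 52) * (3 / 51) * (2 / 50)
print(round(p_jjj, 5))  # → 0.00018
```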
Bayes’ Rule
 There are two ways to factor a joint distribution over
two variables:
P(a, b) = P(a | b) P(b) = P(b | a) P(a)
 Dividing, we get Bayes’ rule:
P(a | b) = P(b | a) P(a) / P(b)
 Why is this at all useful?
 Derive one conditional probability from its reverse (often one
conditional is tricky but the other one is simpler, e.g. due to causality).
 Foundation of many probabilistic inference systems
Inference with Bayes’ Rule
 Example: Computing diagnostic probability from causal
probability (cause → effect)
 Meningitis causes stiff neck (p=0.8)
 I have a stiff neck. Should I worry?
 What is the probability of having meningitis if I have a
stiff neck? P(+m|+s)
Inference with Bayes’ Rule
 Example: Diagnostic probability from causal probability:
 Example:
 M: meningitis, S: stiff neck
 Note: posterior probability of meningitis still very small
 Note: you should still get stiff necks checked out! Why?
Known data
What is the probability of having meningitis if I have a stiff neck?
P(+m | +s) = 0.007944
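The computation can be sketched with Bayes' rule plus total probability for the denominator. P(+s | +m) = 0.8 is from the slide; the prior P(+m) and the rate of stiff necks without meningitis below are illustrative assumptions only, so the printed posterior is close to, but not exactly, the slide's figure:

```python
def bayes(p_e_given_h, p_h, p_e_given_not_h):
    """P(H | E) = P(E | H) P(H) / P(E), with P(E) from total probability."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    return p_e_given_h * p_h / p_e

# Assumed: meningitis is rare (1 in 10,000) while stiff necks
# from other causes are common (1 in 100):
posterior = bayes(p_e_given_h=0.8, p_h=0.0001, p_e_given_not_h=0.01)
print(round(posterior, 4))  # a small posterior, despite the strong likelihood
```

This is the point of the slide: a strong causal likelihood still yields a tiny posterior when the prior is tiny.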
Quiz: Bayes’ Rule
 Given:
 What is P(W | dry) ?
R P
sun 0.8
rain 0.2
D W P
wet sun 0.1
dry sun 0.9
wet rain 0.7
dry rain 0.3
Uses of probability
 Probabilistic reasoning:
 Bayesian Networks
 Probabilistic reasoning over time:
 Markov Models
 Decision making:
 Utility functions, single and multiobjective
Decision theory = probability theory + utility theory
 Markov Decision Processes
 Multiagent decision making:
 Game theory
Related fields: Mathematics, Statistics, Economics
Bayesian networks
 Bayesian networks are a way to represent probabilistic
relationships to capture uncertain knowledge.
 They can be used to perform probabilistic inference:
 Exact probabilistic inference is computationally intractable in the worst
case, but can be used in many practical situations.
 If not, approximate inference algorithms are often applicable.
 Other names for Bayesian networks:
 They were called belief networks in the 1980s and 1990s.
 A causal network is a Bayesian network with additional constraints on
the meaning of the arrows.
 Graphical models are a broader class that includes Bayesian networks.
Bayesian networks
 A typical Bayesian
network, showing both
the topology and the
conditional probability
tables (CPTs).
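The slide's network and CPTs are in the figure; as a stand-in, here is a minimal two-node network (a toy example of my own, not the slide's) showing how CPTs define the joint and support inference by enumeration:

```python
# A two-node Bayesian network: Rain -> WetRoad, so P(R, W) = P(R) * P(W | R)
p_rain = {"+r": 0.2, "-r": 0.8}                              # CPT for Rain
p_wet_given_rain = {("+w", "+r"): 0.9, ("-w", "+r"): 0.1,
                    ("+w", "-r"): 0.2, ("-w", "-r"): 0.8}    # CPT for WetRoad

def joint(w, r):
    """The network's factorized joint: P(R=r) * P(W=w | R=r)."""
    return p_rain[r] * p_wet_given_rain[(w, r)]

# Inference by enumeration: P(+r | +w) = P(+r, +w) / sum over r of P(r, +w)
p_w = joint("+w", "+r") + joint("+w", "-r")
print(round(joint("+w", "+r") / p_w, 4))  # → 0.5294
```

Seeing a wet road raises the probability of rain from the 0.2 prior to about 0.53, which is exactly the diagnostic (effect to cause) reasoning Bayes' rule enables.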
End of Lesson
Uncertainty
Pt.1 Probabilistic Methods

More Related Content

PPTX
Uncertain Knowledge and Reasoning in Artificial Intelligence
PDF
artificial intelligence 13-quantifying uncertainity.pdf
PPTX
Artificial Intelligence and Machine Learning
PPTX
Uncertainty in computer agent - Copy.pptx
PDF
PDF
Modeling and Evaluating Quality in the Presence of Uncertainty
PPTX
CS3491-Unit-2 Uncertainty.pptx
PPTX
AI_Probability.pptx
Uncertain Knowledge and Reasoning in Artificial Intelligence
artificial intelligence 13-quantifying uncertainity.pdf
Artificial Intelligence and Machine Learning
Uncertainty in computer agent - Copy.pptx
Modeling and Evaluating Quality in the Presence of Uncertainty
CS3491-Unit-2 Uncertainty.pptx
AI_Probability.pptx

Similar to Lesson04-Uncertainty - Pt. 1 Probabilistic Methods.pptx (20)

PPT
Artificial Intelligence Bayesian Reasoning
PPT
Earthquake dhnjggbnkkkknvcxsefghjk gyjhvcdyj
PDF
AI CHAPTER 7.pdf
PPTX
22PCOAM11 Session 22 Acting under uncertainty.pptx
PPTX
Uncertainty in AI
PDF
Probabilistic Reasoning bayes rule conditional .pdf
PPTX
Probability in artificial intelligence.pptx
PDF
Uncertain Knowledge in AI from Object Automation
PDF
Modul Topik 6 - Kecerdasan Buatan.pdf
PPTX
Artificial Intelligence Notes Unit 3
PDF
13-uncertainty.pdf
PDF
2018 MUMS Fall Course - Introduction to statistical and mathematical model un...
PPTX
Representing uncertainty in expert systems
PPT
Uncertainty
PDF
Unexpectedness and Bayes' Rule
PDF
Ways to Solve Problems with Uncertain Knowledge.pdf
PDF
Principles of Health Informatics: Artificial intelligence and machine learning
PDF
Handling Uncertainty using Probability Theory, Fuzzy Logic, and Belief Based ...
PPTX
Russel Norvig Uncertainity - chap 13.pptx
PPTX
artificial intelligence and uncertain reasoning
Artificial Intelligence Bayesian Reasoning
Earthquake dhnjggbnkkkknvcxsefghjk gyjhvcdyj
AI CHAPTER 7.pdf
22PCOAM11 Session 22 Acting under uncertainty.pptx
Uncertainty in AI
Probabilistic Reasoning bayes rule conditional .pdf
Probability in artificial intelligence.pptx
Uncertain Knowledge in AI from Object Automation
Modul Topik 6 - Kecerdasan Buatan.pdf
Artificial Intelligence Notes Unit 3
13-uncertainty.pdf
2018 MUMS Fall Course - Introduction to statistical and mathematical model un...
Representing uncertainty in expert systems
Uncertainty
Unexpectedness and Bayes' Rule
Ways to Solve Problems with Uncertain Knowledge.pdf
Principles of Health Informatics: Artificial intelligence and machine learning
Handling Uncertainty using Probability Theory, Fuzzy Logic, and Belief Based ...
Russel Norvig Uncertainity - chap 13.pptx
artificial intelligence and uncertain reasoning
Ad

Recently uploaded (20)

PDF
VCE English Exam - Section C Student Revision Booklet
PDF
Business Ethics Teaching Materials for college
PDF
Chapter 2 Heredity, Prenatal Development, and Birth.pdf
PPTX
Pharma ospi slides which help in ospi learning
PDF
Saundersa Comprehensive Review for the NCLEX-RN Examination.pdf
PDF
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf
PDF
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
PDF
Insiders guide to clinical Medicine.pdf
PPTX
Renaissance Architecture: A Journey from Faith to Humanism
PDF
Anesthesia in Laparoscopic Surgery in India
PPTX
PPH.pptx obstetrics and gynecology in nursing
PPTX
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
PDF
102 student loan defaulters named and shamed – Is someone you know on the list?
PPTX
The Healthy Child – Unit II | Child Health Nursing I | B.Sc Nursing 5th Semester
PDF
Microbial disease of the cardiovascular and lymphatic systems
PPTX
human mycosis Human fungal infections are called human mycosis..pptx
PPTX
school management -TNTEU- B.Ed., Semester II Unit 1.pptx
PDF
2.FourierTransform-ShortQuestionswithAnswers.pdf
PDF
TR - Agricultural Crops Production NC III.pdf
PPTX
master seminar digital applications in india
VCE English Exam - Section C Student Revision Booklet
Business Ethics Teaching Materials for college
Chapter 2 Heredity, Prenatal Development, and Birth.pdf
Pharma ospi slides which help in ospi learning
Saundersa Comprehensive Review for the NCLEX-RN Examination.pdf
The Lost Whites of Pakistan by Jahanzaib Mughal.pdf
ANTIBIOTICS.pptx.pdf………………… xxxxxxxxxxxxx
Insiders guide to clinical Medicine.pdf
Renaissance Architecture: A Journey from Faith to Humanism
Anesthesia in Laparoscopic Surgery in India
PPH.pptx obstetrics and gynecology in nursing
Introduction to Child Health Nursing – Unit I | Child Health Nursing I | B.Sc...
102 student loan defaulters named and shamed – Is someone you know on the list?
The Healthy Child – Unit II | Child Health Nursing I | B.Sc Nursing 5th Semester
Microbial disease of the cardiovascular and lymphatic systems
human mycosis Human fungal infections are called human mycosis..pptx
school management -TNTEU- B.Ed., Semester II Unit 1.pptx
2.FourierTransform-ShortQuestionswithAnswers.pdf
TR - Agricultural Crops Production NC III.pdf
master seminar digital applications in india
Ad

Lesson04-Uncertainty - Pt. 1 Probabilistic Methods.pptx

  • 1. Lesson 04: Uncertainty Pt.1 Probabilistic Methods Uncertainty, belief, probability, fuzziness
  • 2. Encuesta UPM El enlace a las encuestas es: http://servicios.upm.es/encuestas Entrando con la dirección de correo sin la parte “@alumnos.upm.es” y la clave Encuestas de la asignatura y de los profesores en la misma sesión.
  • 3. IA / 4 Uncertainty 3 Uncertain (adjective)  doubtful  a: not known beyond doubt  b: not having certain knowledge  c: not clearly identified or defined  not constant  indefinite, indeterminate  not certain to occur  not reliable, untrustworthy
  • 4. Content  The problem of uncertainty  Handling uncertainty  Probabilistic Methods  Fuzzy Methods IA / 4 Uncertainty 4
  • 5. The problem of uncertainty
  • 6. IA / 4 Uncertainty 6 From clean worlds to noisy worlds  The AI methods we have seen so far have representations that perfectly match their “worlds”  Inferences by sound reasoning can then be translated to the real world and they make sense there  Unfortunately, this is not always the situation that we have in real-life  World is noisy, perception is noisy, reasoning is noisy
  • 7. IA / 4 Uncertainty 7 An historical example of the problem  Expert Systems leverage expert decision making in a limited domain (medical diagnosis, computer configuration, machine fault diagnosis, etc.).  Knowledge is extracted from a willing expert  Knowledge is usually represented as inference rules  These systems can replace human experts in their narrow, precise, domains of expertise
  • 8. IA / 4 Uncertainty 8 The MYCIN Expert System  An AI program for diagnosis and therapy selection for bacterial infections of the blood  Provide advice to non-expert physicians with time considerations and incomplete evidence  Example MYCIN rule: if stain of organism is gramneg and morphology is rod and aerobicity is aerobic then strongly suggestive (0.8) that organism is enterobacteriaceae.
  • 9. IA / 4 Uncertainty 9 MYCIN Architecture Consultation System Explanation System Knowledge Acquisition System Q-A System Dynamic DB Patient Data Context Tree Dynamic Data Static DB Rules Parameter Properties Context Type Properties Tables, Lists Physician Expert
  • 10. IA / 4 Uncertainty 10 Knowledge and Reasoning process  MYCIN domain-specific knowledge in over 450 rules  Each rule is completely modular, all relevant context is contained in the rule with explicitly stated premises  Inexact reasoning uses certainty factors (CFs):  Facts and rules have assigned confidences, from 0 to 1.  Truth of a hypothesis is measured by a sum of the CFs  Positive sum is confirming evidence  Negative sum is disconfirming evidence  MYCIN handled uncertainty. To an extent.
  • 11. IA / 4 Uncertainty 11
  • 13. IA / 4 Uncertainty 13 Different problems to consider  Imprecision and accuracy  Lack of exactness or accuracy  “The result was 13.8 ± 0.3”  Vagueness  Lack of certainty or distinctness  Lack of preciseness in thought or communication  “The result was quite good”  Uncertainty  Not able to be relied on; not known or definite  Not completely confident or sure of something  “I believed that the result was 13” Precision: precisión Accuracy: exactitud
  • 14. Uncertainty in AI  Uncertainty is pervasive in “real-world” AI  AI agents suffer uncertainty in:  Perception  Representation  Reasoning  Action  In this lesson we will show a couple of methods to deal with uncertainty: probabilistic and fuzzy IA / 4 Uncertainty 14 Thought Representation
  • 15. IA / 4 Uncertainty 15 Limits of (classical) logics  Trying to use logic to cope with a real-world domain like medical diagnosis fails for three main reasons:  Laziness/time/cost: It is too much work to list the complete set of antecedents or consequents needed to ensure an exceptionless rule and too hard to use such rules.  Theoretical ignorance: Medical science –or automobile engineering– has no complete, usable, theory for the domain.  Practical ignorance: Even if we know all the rules, we might be uncertain about a particular patient because not all the necessary tests have been or can be run or have noise or …  c.f. the frame, qualification and ramification problems
  • 16. IA / 4 Uncertainty 16 Elimination of simplifying limitations  In previous lessons we have seen fully observable, deterministic, static, known environments  In these worlds the solution to a problem is a precise sequence of actions.  For AIs addressing the real world we shall relax those constraints: real world is not so neat.  Agents in the real world need to handle uncertainty, whether due to partial observability, nondeterminism, dynamism, or the existence of adversaries.
  • 17. IA / 4 Uncertainty 17 Two Classes of Uncertainty  Uncertainty –messiness in the mind of the agent– can have two origins:  Ontological: The world is messy  Epistemological: The world is neat but the agent can’t get a perfect picture of it* Representation Real-world * Neat in terms of the representation formalism
  • 18. IA / 4 Uncertainty 18 Ontological Epistemological a dimly seen rabbit a Schrödinger cat
  • 19. IA / 4 Uncertainty 19 Copilot, please make a poem about a dimly seen rabbit using quantum mechanics terms In the twilight's gentle embrace, A rabbit's form, a fleeting trace, In quantum realms, it hops and plays, In superposition, it spends its days. With wavefunctions that softly sway, Its path uncertain, it cannot stay, Entangled with the moonlit night, A particle of pure delight. Through Heisenberg's uncertain veil, Its position and momentum frail, A quantum leap, a boundless flight, In the dimly seen, it finds its light. In Schrödinger's box, it might reside, Both here and there, it does not hide, A paradox in every hop, In quantum fields, it cannot stop. So let us ponder, let us dream, Of rabbits in a quantum scheme, Where physics and the night entwine, In a dance both wondrous and divine. “Through Heisenberg's uncertain veil”
  • 20. IA / 4 Uncertainty 20 We’ll see two strategies  Probabilistic methods  Deal with epistemological uncertainty  Fuzzy methods  Deal with ontological uncertainty  Other:  Necessity/Possibility theory  Depmster/Shafer belief functions theory  Plaussibility theory  Qualitative / Interval logics  …
  • 22. IA / 4 Uncertainty 22 Dealing with degrees of belief  In general, when there is a connection between real- world facts (cause  effect, if P then Q), there are no strict logical consequences in other directions. The world is complex (if P R ∧ S ∧ T ∧ … ∧ then Q).  the qualification problem  Rain  Wet road  Wet road  Rain I notice the road is wet. Could it have rained?
  • 23. IA / 4 Uncertainty 23 Knowledge and Belief  Philosophers said that knowledge is “justified, true, belief”; however, …  The agent’s “knowledge” can at best provide only a degree of belief in the relevant sentences.  Our main tool for dealing with degrees of belief is probability theory.  Probability theory talks also about “possible worlds”.
  • 24. IA / 4 Uncertainty 24 Probability theory commitments  Probability provides a way of summarizing the uncertainty that comes from our laziness and ignorance, thereby “solving” the qualification problem.  Commitments in logic and probability:  The ontological commitments are the same—that the world is composed of facts that hold or not.  The epistemological commitments are different: a logical agent believes each sentence to be {true, false, no opinion} whereas a probabilistic agent may have a numerical degree of belief between 0 (for sentences that are certainly false) and 1 (certainly true).
  • 25. IA / 4 Uncertainty 25 Fundamental concepts in probability  Basics of probability:  Random Variables  Joint and Marginal Distributions  Conditional Distribution  Product Rule, Chain Rule, Bayes’ Rule  Inference  Independence  Uses: Bayesian networks, Markov chains, etc.  This is basic probability stuff (statistics) that is used a lot in current AI and robotics methods
  • 26. IA / 4 Uncertainty 26 Probabilitstic Inference in Ghostbusters  A ghost is hidden in the grid somewhere  Sensor readings tell how close a square is to the ghost  On the ghost: red  1 or 2 away: orange  3 or 4 away: yellow  5+ away: green  If sensors are noisy, we may knowP(Color | Distance) P(red | 3) P(orange | 3) P(yellow | 3) P(green | 3) 0.05 0.15 0.5 0.3
  • 27. IA / 4 Uncertainty 27 Dealing with Uncertainty  General situation:  Observed variables (evidence): Agent knows certain things about the state of the world (e.g., sensor readings or symptoms)  Unobserved variables: Agent needs to reason about other aspects (e.g. where an object is or what disease is present)  Model: Agent knows something about how the known variables relate to the unknown variables  Probabilistic reasoning gives us a framework for managing our beliefs (e.g. with new info)
  • 28. IA / 4 Uncertainty 28 Random Variables  A random variable is some aspect of the world about which we (may) have uncertainty  R = Is it raining?  T = Is it hot or cold?  D = How long will it take to drive to work?  L = Where is the ghost?  Random variables have ranges (maybe non-numeric)  R in {true, false} (often writen as {+r, -r})  T in {hot, cold}  D in [0, )  L in possible locations, maybe {(0,0), (0,1), …} We denote random variables with upper case letters: R,T,…
  • 29. IA / 4 Uncertainty 29 Probability Distributions Random variables have probability distributions Associate a probability with each value in the range T P hot 0.5 cold 0.5 W P sun 0.6 rain 0.1 fog 0.3 meteor 0.0 Temperature Weather A probability value for each “possible world”
  • 30. IA / 4 Uncertainty 30 Ranges may be multidimensional  A two dimensional normal distribution Ranges may also be discrete or continuous
  • 31. IA / 4 Uncertainty 31 Shorthand notation: Probability Distributions  A distribution is a function from values to probabilities.  A probability (lower case) is a single number p =  Axioms for a valid “probability distribution”:
  • 32. IA / 4 Uncertainty 32 Joint Distributions  A joint distribution over a set of random variables specifies a real number for each tuple (or outcome)  Must obey: P(x1, …, xn) ≥ 0 and Σ P(x1, …, xn) = 1, summing over all tuples T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 A probabilistic model is a joint distribution
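Stored as a table keyed by outcome tuples, the two conditions are easy to check. A minimal sketch using the joint P(T, W) from the slide:

```python
# Joint distribution P(T, W) from the slide, keyed by (t, w) tuples.
P = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
     ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

# A valid joint distribution is non-negative and sums to 1 over all tuples.
assert all(p >= 0 for p in P.values())
assert abs(sum(P.values()) - 1.0) < 1e-9
print(len(P))  # 4 outcomes = |range(T)| * |range(W)|
```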
  • 33. IA / 4 Uncertainty 33 Events  An event is a set E of outcomes  From a joint distribution, we can calculate the probability of any event  Probability that it’s hot AND sunny? 0.4  Probability that it’s hot? 0.4+0.1 = 0.5  Probability that it’s hot OR sunny? 0.4+0.1+0.2=0.7  Typically, the events we care about are partial assignments, like P(W=sunny) T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 “W=sunny” event P(W=sunny) = 0.6
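The event computations above can be sketched directly: an event is a set of outcomes, so its probability is the sum over the matching rows of the joint table.

```python
# Joint distribution P(T, W) from the slide.
P = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
     ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def p_event(joint, matches):
    """P(E) = sum of the probabilities of the outcomes in E."""
    return sum(p for outcome, p in joint.items() if matches(outcome))

print(p_event(P, lambda o: o == ("hot", "sun")))  # 0.4 (hot AND sunny)
print(p_event(P, lambda o: o[0] == "hot"))        # 0.5 (hot)
print(round(p_event(P, lambda o: o[0] == "hot" or o[1] == "sun"), 2))  # 0.7 (hot OR sunny)
```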
  • 34. IA / 4 Uncertainty 34 Quiz: Events  P(+x AND +y) ?  P(+x) ?  P(-y OR +x) ? X Y P +x +y 0.2 +x -y 0.3 -x +y 0.4 -x -y 0.1
  • 35. IA / 4 Uncertainty 35 Marginal Distributions  Marginal distributions are sub-tables which eliminate variables  Marginalization (summing out): Combine collapsed rows by adding T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 T P hot 0.5 cold 0.5 W P sun 0.6 rain 0.4
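Marginalization (summing out) as shown above takes only a few lines:

```python
from collections import defaultdict

# Joint distribution P(T, W) from the slide.
P = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
     ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def marginal(joint, index):
    """Sum out every variable except the one at position `index`."""
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[outcome[index]] += p
    return dict(m)

print(marginal(P, 0))  # {'hot': 0.5, 'cold': 0.5}
print(marginal(P, 1))  # sun 0.6, rain 0.4 (up to float rounding)
```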
  • 36. IA / 4 Uncertainty 36 Quiz: Compute Marginal Distributions X Y P +x +y 0.2 +x -y 0.3 -x +y 0.4 -x -y 0.1 X P +x -x Y P +y -y
  • 37. IA / 4 Uncertainty 37 Quiz: Compute Marginal Distributions X Y P +x +y 0.2 +x -y 0.3 -x +y 0.4 -x -y 0.1 X P +x 0.5 -x 0.5 Y P +y 0.6 -y 0.4
  • 38. IA / 4 Uncertainty 38 Conditional Probabilities  Conditional probability is the probability of a, given b  What is the probability of “sun” if it is “cold”?  What is the probability of a sunny day being cold (P(cold|sunny))?  This is the definition of a conditional probability: P(a | b) = P(a, b) / P(b)  This simple relation ties together joint, conditional and marginal probabilities  b is called “evidence”
  • 39. Conditional Probabilities  Use the simple relation between joint and conditional probabilities (probability of a, given b) taken from the definition of conditional probability IA / 4 Uncertainty 39 T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 What is the probability of “sun” if it is “cold”? P(sun | cold) = P(sun, cold) / P(cold) = 0.2 / 0.5 = 0.4
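Computed directly from the joint table, using the definition P(a | b) = P(a, b) / P(b):

```python
# Joint distribution P(T, W) from the slide.
P = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
     ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

# P(sun | cold) = P(sun, cold) / P(cold)
p_cold = sum(p for (t, w), p in P.items() if t == "cold")  # marginal P(cold) = 0.5
p_sun_given_cold = P[("cold", "sun")] / p_cold
print(p_sun_given_cold)  # 0.4
```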
  • 40. Quiz: Conditional Probabilities  P(+x | +y) ?  P(-x | +y) ? IA / 4 Uncertainty 40 X Y P +x +y 0.2 +x -y 0.3 -x +y 0.4 -x -y 0.1
  • 41. Conditional Distributions  Conditional distributions are probability distributions over some variables given fixed values of others IA / 4 Uncertainty 41 T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 W P sun 0.8 rain 0.2 W P sun 0.4 rain 0.6 Conditional Distributions Joint Distribution
  • 42. IA / 4 Uncertainty 42 Normalization Trick (scale to 1.0) T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 W P sun 0.4 rain 0.6
  • 43. IA / 4 Uncertainty 43 SELECT the joint probabilities matching the evidence (T=cold) Normalization Trick T W P hot sun 0.4 hot rain 0.1 cold sun 0.2 cold rain 0.3 W P sun 0.4 rain 0.6 T W P cold sun 0.2 cold rain 0.3 NORMALIZE the selection (make it sum 1.0)
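The SELECT-then-NORMALIZE trick above can be sketched as a small function:

```python
# Joint distribution P(T, W) from the slide.
P = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
     ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

def condition_on(joint, position, value):
    """SELECT rows matching the evidence, then NORMALIZE them to sum to 1."""
    selected = {o: p for o, p in joint.items() if o[position] == value}
    z = sum(selected.values())                 # normalization constant
    return {o: p / z for o, p in selected.items()}

print(condition_on(P, 0, "cold"))
# {('cold', 'sun'): 0.4, ('cold', 'rain'): 0.6}
```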
  • 44. IA / 4 Uncertainty 44 Quiz: Normalization Trick X Y P +x +y 0.2 +x -y 0.3 -x +y 0.4 -x -y 0.1 SELECT the joint probabilities matching the evidence NORMALIZE the selection (make it sum 1.0)  P(X | Y=-y) ?
  • 45. IA / 4 Uncertainty 45 Independence  Two random variables are independent if knowing the value of one does not change the probability of the other: P(a|b)=P(a) P(b|a)=P(b) P(a∧b)=P(a)P(b)  When it holds, independence can help in reducing the size of the domain representation and the complexity of the inference problem.
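Independence can be checked mechanically: compare each joint entry against the product of its marginals. A sketch for a two-variable joint; the second table is a made-up example that factors exactly:

```python
from collections import defaultdict

def is_independent(joint, tol=1e-9):
    """True iff P(x, y) == P(x) * P(y) for every cell of a 2-variable joint."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return all(abs(joint[(x, y)] - px[x] * py[y]) < tol
               for x in px for y in py)

# The slide's joint P(T, W): dependent (e.g. 0.4 != 0.5 * 0.6).
P_dep = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
# A made-up joint that factors exactly, hence independent.
P_ind = {("hot", "sun"): 0.3, ("hot", "rain"): 0.2,
         ("cold", "sun"): 0.3, ("cold", "rain"): 0.2}

print(is_independent(P_dep))  # False
print(is_independent(P_ind))  # True
```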
  • 46. Probabilistic Inference  Probabilistic inference is the computation of a specific probability from other, known probabilities (e.g. conditional from joint)  We generally compute conditional probabilities  P(on time | no reported accidents) = 0.90  These represent the agent’s beliefs given the evidence  Probabilities change with new evidence:  P(on time | no accidents, 5 a.m.) = 0.95  P(on time | no accidents, 5 a.m., raining) = 0.80  Observing new evidence causes beliefs to be updated IA / 4 Uncertainty 46
  • 47. Probabilistic Inference Rules  From the definition of a conditional probability  We can derive some inference rules:  The product rule  The chain rule  Bayes’ rule IA / 4 Uncertainty 47
  • 48. The Product Rule  Compute joint distributions from conditional and marginal distributions: P(a, b) = P(a | b) P(b) IA / 4 Uncertainty 48
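In code, the product rule builds a joint table by multiplying each conditional entry by the matching marginal. A sketch using the P(W) and P(D | W) tables from the quiz slide:

```python
# P(W) and P(D | W): the marginal and conditional tables from the quiz slide.
P_W = {"sun": 0.8, "rain": 0.2}
P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

# Product rule: P(d, w) = P(d | w) * P(w)
joint = {(d, w): p * P_W[w] for (d, w), p in P_D_given_W.items()}
print(round(joint[("wet", "sun")], 2))  # 0.08
print(round(sum(joint.values()), 2))    # 1.0  (a valid joint distribution)
```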
  • 49. IA / 4 Uncertainty 49 Quiz: The Product Rule R P sun 0.8 rain 0.2 D W P wet sun 0.1 dry sun 0.9 wet rain 0.7 dry rain 0.3 D W P wet sun 0.08 dry sun 0.72 wet rain 0.14 dry rain 0.06 Sum = 1.0
  • 50. The Chain Rule  Compute a joint distribution as an incremental product of conditional distributions: P(x1, …, xn) = P(x1) P(x2 | x1) P(x3 | x1, x2) … IA / 4 Uncertainty 50
  • 51. IA / 4 Uncertainty 51 The chain rule P(6, 6, 6)? P(J, J, J)?
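A worked sketch of the two quiz questions, assuming the first asks for three sixes with a fair die and the second for three jacks (J) drawn from a 52-card deck without replacement:

```python
from fractions import Fraction

# Chain rule: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2)

# Three sixes with a fair die: the rolls are independent, so every
# conditional reduces to 1/6.
p_three_sixes = Fraction(1, 6) * Fraction(1, 6) * Fraction(1, 6)
print(p_three_sixes)  # 1/216

# Three jacks without replacement: here each conditional really does
# depend on the earlier draws.
p_three_jacks = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p_three_jacks)  # 1/5525
```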
  • 52. Bayes’ Rule  There are two ways to factor a joint distribution over two variables: P(a, b) = P(a | b) P(b) = P(b | a) P(a)  Dividing, we get Bayes’ rule: P(a | b) = P(b | a) P(a) / P(b)  Why is this at all useful?  Derive one conditional probability from its reverse (often one conditional is tricky but the other one is simpler, e.g. due to causality).  Foundation of many probabilistic inference systems IA / 4 Uncertainty 52
  • 53. Inference with Bayes’ Rule  Example: Computing diagnostic probability from causal probability (cause → effect)  Meningitis causes stiff neck (p = 0.8)  I have a stiff neck. Should I worry?  What is the probability of having meningitis if I have a stiff neck? P(+m|+s) IA / 4 Uncertainty 53
  • 54. Inference with Bayes’ Rule  Example: Diagnostic probability from causal probability:  M: meningitis, S: stiff neck  What is the probability of having meningitis if I have a stiff neck? P(+m|+s)  From the known data: P(+m|+s) = 0.007944  Note: posterior probability of meningitis still very small  Note: you should still get stiff necks checked out! Why? IA / 4 Uncertainty 54
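The Bayes computation can be sketched as follows. P(+s | +m) = 0.8 comes from the slide, while the prior P(+m) = 0.0001 and P(+s | −m) = 0.01 are assumed illustrative values, so the result only approximates the slide's 0.007944:

```python
def bayes(p_e_given_h, p_h, p_e_given_not_h):
    """P(H | E) = P(E | H) P(H) / P(E), with P(E) from total probability."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1.0 - p_h)
    return p_e_given_h * p_h / p_e

# P(+s | +m) = 0.8 from the slide; the other two numbers are assumed.
p_m_given_s = bayes(p_e_given_h=0.8, p_h=0.0001, p_e_given_not_h=0.01)
print(round(p_m_given_s, 4))  # still tiny, roughly 0.0079
```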
  • 55. IA / 4 Uncertainty 55 Quiz: Bayes’ Rule  Given:  What is P(W | dry) ? R P sun 0.8 rain 0.2 D W P wet sun 0.1 dry sun 0.9 wet rain 0.7 dry rain 0.3
  • 56. IA / 4 Uncertainty 56 Uses of probability  Probabilistic reasoning:  Bayesian Networks  Probabilistic reasoning over time:  Markov Models  Decision making:  Utility functions, single and multiobjective Decision theory = probability theory + utility theory  Markov Decision Processes  Multiagent decision making:  Game theory Mathematics Statistics Economics
  • 57. IA / 4 Uncertainty 57 Bayesian networks  Bayesian networks are a way to represent probabilistic relationships to capture uncertain knowledge.  They can be used to perform probabilistic inference:  Exact probabilistic inference is computationally intractable in the worst case, but can be used in many practical situations.  If not, approximate inference algorithms are often applicable.  Other names for Bayesian networks:  They were called belief networks in the 1980s and 1990s.  A causal network is a Bayesian network with additional constraints on the meaning of the arrows.  Graphical models are a broader class that includes Bayesian networks.
  • 58. IA / 4 Uncertainty 58 Bayesian networks  A typical Bayesian network, showing both the topology and the conditional probability tables (CPTs).
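A Bayesian network factors the joint distribution as a product of CPT entries, P(x1, …, xn) = Πi P(xi | parents(xi)). A minimal sketch, assuming the classic burglary-alarm network; the CPT numbers below are the usual textbook values, not read from this slide's figure:

```python
# CPTs for the (assumed) burglary-alarm network.
P_b = 0.001                      # P(Burglary)
P_e = 0.002                      # P(Earthquake)
P_a = {(True, True): 0.95, (True, False): 0.94,   # P(Alarm | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_j = {True: 0.90, False: 0.05}  # P(JohnCalls | Alarm)
P_m = {True: 0.70, False: 0.01}  # P(MaryCalls | Alarm)

# One full joint entry is just a product of CPT lookups:
# P(j, m, a, ~b, ~e) = P(~b) P(~e) P(a | ~b, ~e) P(j | a) P(m | a)
p = (1 - P_b) * (1 - P_e) * P_a[(False, False)] * P_j[True] * P_m[True]
print(f"{p:.8f}")  # 0.00062811
```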
  • 59. End of Lesson Uncertainty Pt.1 Probabilistic Methods

Editor's Notes

  • #25: Like “sort this list” or “add these two numbers”