BAYESIAN NETWORKS
By Ahmad Ali Al Taweel
CONDITIONAL PROBABILITY
Events A and B
• P(A|B): the probability that event A occurs given that event B has already occurred.
Example:
There are 2 baskets. B1 has 2 red balls and 5 blue balls. B2 has 4 red balls and 3 blue balls. What is the probability of picking a red ball from basket 1?
• The question asks for P(red ball | basket 1).
• Intuitively, we restrict the sample space to basket 1 only.
• So the answer is 2/7.
• The defining equation is:
P(A|B) = P(A∩B)/P(B) [definition of conditional probability]
P(A,B) = P(A)*P(B) [if A and B are independent]
How do you solve P(basket 2 | red ball)?
BAYES' THEOREM
• Deriving Bayes' theorem from the product rule:
P(A∩B) = P(B) x P(A|B)
P(B∩A) = P(A) x P(B|A)
Since P(A∩B) = P(B∩A),
P(B) x P(A|B) = P(A) x P(B|A)
=> P(A|B) = [P(A) x P(B|A)] / P(B)
[Venn diagram of overlapping events A and B]
P(A|B) = [P(A) x P(B|A)] / P(B)
       = [P(A) x P(B|A)] / [P(B|A) x P(A) + P(B|¬A) x P(¬A)]
BAYES' THEOREM
Solution to P(basket 2 | red ball):
P(basket 2 | red ball) = [P(b2) x P(r | b2)] / P(r)
P(r) = P(r | b1) P(b1) + P(r | b2) P(b2) = (2/7)(1/2) + (4/7)(1/2) = 6/14
P(basket 2 | red ball) = [(1/2) x (4/7)] / (6/14) = 2/3 ≈ 0.67
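To sanity-check this arithmetic, the small Python sketch below applies Bayes' rule and the law of total probability directly to the basket numbers; the variable names are my own, and only the figures stated above are used.

```python
# Basket posterior check: P(basket 2 | red ball) via Bayes' rule.
# Both baskets are taken as equally likely to be chosen, as in the slides.
p_b1, p_b2 = 0.5, 0.5
p_red_given_b1 = 2 / 7   # basket 1: 2 red, 5 blue
p_red_given_b2 = 4 / 7   # basket 2: 4 red, 3 blue

# Law of total probability: P(r) = P(r|b1)P(b1) + P(r|b2)P(b2) = 3/7 = 6/14.
p_red = p_red_given_b1 * p_b1 + p_red_given_b2 * p_b2

# Bayes' rule: P(b2 | r) = P(b2) * P(r | b2) / P(r).
p_b2_given_red = p_b2 * p_red_given_b2 / p_red
print(round(p_b2_given_red, 3))   # 0.667, i.e. 2/3
```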
BAYESIAN NETWORK
• A Bayesian network is a graph-based model for conditional independence assertions and hence for compact specification of full joint distributions.
• A Bayesian network B is defined as a pair B = (G, P), where G = (V(G), A(G)) is an acyclic directed graph with a set of vertices (or nodes) V(G) = {X1, X2, . . . , Xn} and a set of arcs A(G) ⊆ V(G) × V(G), and where P is a joint probability distribution defined on the variables corresponding to the vertices V(G).
A Bayesian network consists of [Jensen, 1996]:
A set of variables and a set of directed edges between variables.
1. Each variable has a finite set of mutually exclusive states.
2. The variables and directed edges form a DAG (directed acyclic graph).
3. To each variable A with parents B1, B2, ..., Bn there is attached a conditional probability table P(A | B1, B2, ..., Bn).
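As a concrete illustration of these three requirements, here is a minimal sketch of one way to hold such a network in Python, assuming binary variables; the variable names and probabilities are invented for illustration and are not part of the slides.

```python
# A toy Bayesian network per Jensen's definition: each variable has a finite
# set of mutually exclusive states, a list of parents (forming a DAG), and a
# conditional probability table P(variable = True | parent values).
bayes_net = {
    "Rain": {
        "states": (True, False),
        "parents": [],
        "cpt": {(): 0.2},                       # P(Rain = True)
    },
    "Sprinkler": {
        "states": (True, False),
        "parents": ["Rain"],
        "cpt": {(True,): 0.01, (False,): 0.4},  # P(Sprinkler = True | Rain)
    },
    "WetGrass": {
        "states": (True, False),
        "parents": ["Rain", "Sprinkler"],
        "cpt": {                                # P(WetGrass = True | Rain, Sprinkler)
            (True, True): 0.99, (True, False): 0.80,
            (False, True): 0.90, (False, False): 0.01,
        },
    },
}
```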
FORMS OF THE BAYESIAN NETWORKS
Each node in these directed acyclic graphs is a random variable. Informally, an arrow from node X to node Y means X has a direct influence on Y. A node X is a parent of a node Y if there is an arrow from node X to node Y, e.g. A is a parent of B and C in the "conditionally independent effects" graph below.
• Marginal independence (no edges between A, B, C):
p(A,B,C) = p(A) p(B) p(C)
• Common effect (A and B are both parents of C):
p(A,B,C) = p(C|A,B) p(A) p(B)
• Conditionally independent effects (A is the parent of both B and C):
p(A,B,C) = p(B|A) p(C|A) p(A)
B and C are conditionally independent given A.
• Markov dependence (chain A → B → C):
p(A,B,C) = p(C|B) p(B|A) p(A)
For a larger network, the joint distribution factorizes the same way, one factor per variable given its parents:
P(A,B,C,D,E,F) = P(F|C,D,E) P(A,B,C,D,E)
= P(F|C,D,E) P(C|A,E) P(D|B) P(E|B) P(B,A)
= P(F|C,D,E) P(C|A,E) P(D|B) P(E|B) P(B|A) P(A)
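One way to make the link between the graph and this factorization explicit is to read one factor per variable off its parent set. The sketch below uses the parent sets implied by the slide's factorization; the helper function is hypothetical, shown only to illustrate the idea.

```python
# Parent sets implied by the factorization on the slide:
# P(F|C,D,E) P(C|A,E) P(D|B) P(E|B) P(B|A) P(A).
parents = {
    "A": [],
    "B": ["A"],
    "C": ["A", "E"],
    "D": ["B"],
    "E": ["B"],
    "F": ["C", "D", "E"],
}

def factorization(parents):
    """Build the joint-distribution factorization string, one factor per variable."""
    factors = []
    for var, pa in parents.items():
        factors.append(f"P({var}|{','.join(pa)})" if pa else f"P({var})")
    return " ".join(factors)

print(factorization(parents))
# P(A) P(B|A) P(C|A,E) P(D|B) P(E|B) P(F|C,D,E)
```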
EXAMPLE
• Learning phase
• Testing phase (inference)
[The Bayesian network figures for the learning and testing phases are not reproduced here.]
Using a Bayesian Network Example (Weng-Keen Wong, Oregon State University ©2005)
Using the network in the example (A → B, with B → C and B → D), suppose you want to calculate:
P(A = true, B = true, C = true, D = true)
= P(A = true) * P(B = true | A = true) * P(C = true | B = true) * P(D = true | B = true)
= (0.4) * (0.3) * (0.1) * (0.95)
= 0.0114
The factorization comes from the graph structure; the numbers come from the conditional probability tables.
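A minimal sketch of this calculation in Python, assuming binary variables and using only the CPT entries quoted on the slide (the remaining CPT rows are not given there, so they are omitted here):

```python
# Joint probability of the all-true assignment in the network A -> B, B -> C, B -> D:
# multiply each variable's CPT entry given its parents' values.
p_a = 0.4           # P(A = true)
p_b_given_a = 0.3   # P(B = true | A = true)
p_c_given_b = 0.1   # P(C = true | B = true)
p_d_given_b = 0.95  # P(D = true | B = true)

joint = p_a * p_b_given_a * p_c_given_b * p_d_given_b
print(round(joint, 4))   # 0.0114
```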
Bayesian Networks
Two important properties:
1. The graph structure encodes the conditional independence relationships between the variables.
2. The network is a compact representation of the joint probability distribution over the variables.
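To make the "compact representation" claim concrete, compare parameter counts for the four-variable network used above (A → B, B → C, B → D), assuming all variables are binary; this back-of-the-envelope count is my own illustration, not from the slides.

```python
# Full joint table over 4 binary variables needs 2**4 - 1 free parameters.
full_joint_params = 2 ** 4 - 1       # 15

# The Bayesian network needs one parameter per CPT row:
# P(A): 1 row, P(B|A): 2 rows, P(C|B): 2 rows, P(D|B): 2 rows.
bn_params = 1 + 2 + 2 + 2            # 7

print(full_joint_params, bn_params)  # 15 7
```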
Why Bayesian Networks?
• Bayesian probability represents the degree of belief in an event, while classical (frequentist) probability deals with the true or physical probability of an event.
• Bayesian networks can handle incomplete data sets.
• They support learning about causal relationships.
• They facilitate combining domain knowledge and data.
• They offer an efficient and principled approach to avoiding overfitting.
CONCLUSION
• Bayesian networks are a network-based framework for representing and analyzing models involving uncertainty.
• They support cross-fertilization of ideas between the artificial intelligence, decision analysis, and statistics communities.
• Their use is growing because of the development of propagation algorithms, followed by the availability of easy-to-use commercial software and a growing number of creative applications.