Mutually and Non-Mutually Exclusive Events
Theorems of Probability
 There are 2 important theorems of probability, which are as follows:
1. The Addition Theorem
2. The Multiplication Theorem
Addition theorem when events are Mutually Exclusive
 Definition: It states that if 2 events A and B are mutually exclusive, then the probability of the occurrence of either A or B is the sum of their individual probabilities.
 Symbolically,
P(A or B) or P(A U B) = P(A) + P(B)
The theorem can be extended to three or more mutually exclusive events. Thus,
P(A or B or C) = P(A) + P(B) + P(C)
Addition theorem when events are not Mutually Exclusive (Overlapping or Intersecting Events)
 Definition: It states that if 2 events A and B are not mutually exclusive, then the probability of the occurrence of either A or B is the sum of their individual probabilities minus the probability of the occurrence of both A and B.
 Symbolically,
P(A or B) or P(A U B) = P(A) + P(B) – P(A ∩ B)
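To make the two addition rules concrete, here is a minimal Python sketch (not part of the original slides); the card and die probabilities used are standard textbook values, included only for illustration:

```python
# Addition rules for two events A and B (illustrative values only).

def p_a_or_b_mutually_exclusive(p_a, p_b):
    # P(A U B) = P(A) + P(B) when A and B cannot happen together.
    return p_a + p_b

def p_a_or_b_overlapping(p_a, p_b, p_a_and_b):
    # P(A U B) = P(A) + P(B) - P(A ∩ B) when A and B can happen together.
    return p_a + p_b - p_a_and_b

# Overlapping example: drawing one card from a standard deck,
# A = "heart" (13/52), B = "king" (4/52), A ∩ B = "king of hearts" (1/52).
print(p_a_or_b_overlapping(13/52, 4/52, 1/52))   # 16/52 ≈ 0.3077

# Mutually exclusive example: rolling one die, A = "roll a 1", B = "roll a 2".
print(p_a_or_b_mutually_exclusive(1/6, 1/6))     # 2/6 ≈ 0.3333
```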
Mutually Exclusive Events
Two events are mutually exclusive if they cannot occur at the same time (i.e., they have no outcomes in common). In a Venn diagram, the events A and B are represented by two disjoint sets (i.e., they have no elements in common).
Non-Mutually Exclusive Events
Two events are non-mutually exclusive if they have one or more outcomes in common. In a Venn diagram, the events A and B are represented by two intersecting sets (i.e., they have some elements in common).
The Addition Rule: Mutually Exclusive
P(A or B) = P(A) + P(B)
The Addition Rule: Non-mutually Exclusive
P(A or B) = P(A)+P(B) - P(A and B)
Here P(A) is the probability of A happening, P(B) is the probability of B happening, and P(A and B) is the probability of A and B happening together. P(A or B) is the probability of either A or B happening; the subtraction of P(A and B) is needed only when A and B are not mutually exclusive.
Multiplication theorem
 Definition: States that if 2 events A and B are independent, then the probability of the occurrence of both of them (A & B) is the product of their individual probabilities.
 Symbolically,
Probability of happening of both the events:
P(A and B) or P(A ∩ B) = P(A) x P(B)
The theorem can be extended to 3 or more independent events. Thus,
P(A, B and C) or P(A ∩ B ∩ C) = P(A) x P(B) x P(C)
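A short sketch of the multiplication theorem for independent events (again not from the slides); the fair-coin value 0.5 is assumed purely for illustration:

```python
from functools import reduce
from operator import mul

def p_all_independent(*probs):
    # P(A ∩ B ∩ ...) = P(A) x P(B) x ... when the events are independent.
    return reduce(mul, probs, 1.0)

# Probability of three heads in a row with a fair coin.
print(p_all_independent(0.5, 0.5, 0.5))   # 0.125
```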
How to calculate probability in case of Dependent Events
1. Probability of occurrence of at least A or B:
   (a) When events are mutually exclusive: P(A U B) = P(A) + P(B)
   (b) When events are not mutually exclusive: P(A U B) = P(A) + P(B) – P(A ∩ B)
2. Probability of occurrence of both A & B: P(A ∩ B) = P(A) + P(B) – P(A U B)
3. Probability of occurrence of A & not B: P(A ∩ B̄) = P(A) – P(A ∩ B)
4. Probability of occurrence of B & not A: P(Ā ∩ B) = P(B) – P(A ∩ B)
5. Probability of non-occurrence of both A & B: P(Ā ∩ B̄) = 1 – P(A U B)
6. Probability of non-occurrence of at least A or B: P(Ā U B̄) = 1 – P(A ∩ B)
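The sketch below evaluates these dependent-event identities for one assumed set of inputs; the values P(A) = 0.5, P(B) = 0.4, and P(A U B) = 0.7 are made up for illustration and are not from the slides:

```python
# Assumed probabilities for two dependent events A and B (illustrative only).
p_a, p_b, p_a_or_b = 0.5, 0.4, 0.7

p_a_and_b  = p_a + p_b - p_a_or_b   # P(A ∩ B)
p_a_not_b  = p_a - p_a_and_b        # P(A ∩ B̄)
p_b_not_a  = p_b - p_a_and_b        # P(Ā ∩ B)
p_neither  = 1 - p_a_or_b           # P(Ā ∩ B̄)
p_not_both = 1 - p_a_and_b          # P(Ā U B̄)

print(p_a_and_b, p_a_not_b, p_b_not_a, p_neither, p_not_both)
# 0.2 0.3 0.2 0.3 0.8 (up to floating-point rounding)
```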
How to calculate probability in case of Independent Events
1. Probability of occurrence of both A & B: P(A ∩ B) = P(A) x P(B)
2. Probability of non-occurrence of both A & B: P(Ā ∩ B̄) = P(Ā) x P(B̄)
3. Probability of occurrence of A & not B: P(A ∩ B̄) = P(A) x P(B̄)
4. Probability of occurrence of B & not A: P(Ā ∩ B) = P(Ā) x P(B)
5. Probability of occurrence of at least one event: P(A U B) = 1 – P(Ā ∩ B̄) = 1 – [P(Ā) x P(B̄)]
6. Probability of non-occurrence of at least one event: P(Ā U B̄) = 1 – P(A ∩ B) = 1 – [P(A) x P(B)]
7. Probability of occurrence of only one event: P(A ∩ B̄) + P(Ā ∩ B) = [P(A) x P(B̄)] + [P(Ā) x P(B)]
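The analogous check for independent events; P(A) = P(B) = 0.9 is an assumed value (for instance, heads on two separate tosses of the biased coin described later in the deck), and the printed results are approximate because of floating point:

```python
p_a, p_b = 0.9, 0.9          # assumed independent events, e.g. heads on toss 1 and toss 2
q_a, q_b = 1 - p_a, 1 - p_b  # complements P(Ā) and P(B̄)

print(p_a * p_b)              # both occur:           ≈ 0.81
print(q_a * q_b)              # neither occurs:       ≈ 0.01
print(1 - q_a * q_b)          # at least one occurs:  ≈ 0.99
print(1 - p_a * p_b)          # at least one fails:   ≈ 0.19
print(p_a * q_b + q_a * p_b)  # exactly one occurs:   ≈ 0.18
```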
Problem
 An inspector of the Alaska Pipeline has the task of comparing the reliability of 2 pumping stations. Each station is susceptible to 2 kinds of failure: pump failure & leakage. When either (or both) occur, the station must be shut down. The data at hand indicate that the following probabilities prevail:
Station 1: P(Pump failure) = 0.07, P(Leakage) = 0.10, P(Both) = 0
Station 2: P(Pump failure) = 0.09, P(Leakage) = 0.12, P(Both) = 0.06
Which station has the higher probability of being shut down?
Solution
P(Pump failure or Leakage)
= P(Pump Failure) + P(Leakage Failure)
– P(Pump Failure ∩ Leakage Failure)
Station 1: = 0.07 + 0.10 – 0
= 0.17
Station 2: = 0.09 + 0.12 – 0.06
= 0.15
Thus, station 1 has the higher
probability of being shut down.
Probability Rules
Probabilities under conditions of
Statistical Independence
 Statistically Independent Events: The occurrence of one event has no effect on the probability of the occurrence of any other event.
Most managers who use probabilities are concerned with 2 conditions:
1. The case where one event or another will occur.
2. The situation where 2 or more events will both occur.
 There are 3 types of probabilities under
statistical independence.
Marginal
Joint
Conditional
Marginal (Unconditional) Probability: A single probability, where only one event can take place.
Joint Probability: The probability of 2 or more events occurring together or in succession.
Conditional Probability: The probability that a second event (B) will occur if a first event (A) has already happened.
Example: Marginal Probability - Statistical Independence
 A single probability where only one event
can take place.
Marginal Probability of an Event
P(A) = P(A)
Example 1: On each individual toss of a biased or unfair coin, P(H) = 0.90 & P(T) = 0.10. The outcomes of several tosses of this coin are still statistically independent events, even though the coin is biased.
Example 2: 50 students of a school drew a lottery to see which student would get a free trip to the Carnival at Goa. Any one of the students can calculate his/her chances of winning as:
P(Winning) = 1/50 = 0.02
Example: Joint Probability - Statistical Independence
 The probability of 2 or more independent events
occurring together or in succession is the product of their
marginal probabilities.
Joint Probability of 2 Independent Events
P(AB) = P(A) * P(B)
Example: - What is the probability of heads on 2
successive tosses?
P(H1H2) = P(H1) * P(H2)
= 0.5 * 0.5 = 0.25
The probability of heads on 2 successive tosses is
0.25, since the probability of any outcome is not
affected by any preceding outcome.
 We can make the probabilities of events even more explicit using a Probabilistic Tree.
Toss 1: H1 = 0.5, T1 = 0.5
Toss 2: H1H2 = 0.25, H1T2 = 0.25, T1H2 = 0.25, T1T2 = 0.25
Toss 3: H1H2H3, H1H2T3, H1T2H3, H1T2T3, T1H2H3, T1H2T3, T1T2H3, T1T2T3 = 0.125 each
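A small sketch (assuming a fair coin) that rebuilds this probabilistic tree by enumerating every sequence of tosses and multiplying the per-toss probabilities:

```python
from itertools import product

def toss_tree(n_tosses, p_heads=0.5):
    # Enumerate every H/T sequence of length n_tosses and attach its joint
    # probability, the product of the per-toss probabilities (the tosses
    # are independent).
    tree = {}
    for seq in product("HT", repeat=n_tosses):
        p = 1.0
        for outcome in seq:
            p *= p_heads if outcome == "H" else (1 - p_heads)
        tree["".join(seq)] = p
    return tree

print(toss_tree(2))  # {'HH': 0.25, 'HT': 0.25, 'TH': 0.25, 'TT': 0.25}
print(toss_tree(3))  # eight sequences, each with probability 0.125
```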
Example: Conditional Probability - Statistical Independence
 For statistically independent events, conditional
probability of event B given that event A has occurred is
simply the probability of event B.
Conditional Probability for 2 Independent Events
P(B|A) = P(B)
Example: - What is the probability that the second toss
of a fair coin will result in heads, given that heads
resulted on the first toss?
P(H2|H1) = 0.5
For 2 independent events, the result of the first toss has absolutely no effect on the result of the second toss.
Probabilities under conditions of Statistical
Dependence
 Statistical Dependence exists when the probability
of some event is dependent on or affected by the
occurrence of some other event.
 The types of probabilities under statistical
dependence are:
• Marginal
• Joint
• Conditional
Example
 Assume that a box contains 10 balls distributed as follows:
 3 are colored & dotted
 1 is colored & striped
 2 are gray & dotted
 4 are gray & striped
Each of the 10 balls is equally likely to be drawn, so each has probability 0.1:
Balls 1–3 (probability 0.1 each): Colored & Dotted
Ball 4 (probability 0.1): Colored & Striped
Balls 5–6 (probability 0.1 each): Gray & Dotted
Balls 7–10 (probability 0.1 each): Gray & Striped
Example: Marginal Probability - Statistically Dependent
 It can be computed by summing up all the joint events
in which the simple event occurs.
 Compute the marginal probability of the event colored.
It can be computed by summing up the probabilities of
the two joint events in which colored occurred:
P(C) = P(CD) + P(CS)
= 0.3 + 0.1
= 0.4
Example: Joint Probability - Statistically Dependent
 Joint probabilities under conditions of statistical dependence are given by
Joint probability for Statistically Dependent Events
P(BA) = P(B|A) * P(A)
• What is the probability that a ball drawn from the box is dotted and colored?
Probability of a colored & dotted ball:
P(DC) = P(D|C) * P(C)
= (0.3/0.4) * 0.4
= 0.3
Example: Conditional Probability - Statistically Dependent
 Given A & B to be 2 events, then
Conditional probability for Statistically Dependent Events
P(B|A) = P(BA) / P(A)
where P(B|A) is the probability of event B given that event A has occurred.
What is the probability that a ball drawn from the box is dotted, given that it is colored?
The probability of drawing any one of the balls from this box is 0.1 (1/10), since the total number of balls in the box is 10.
We know that there are 4 colored balls, 3 of which are dotted & 1 of which is striped.
P(D|C) = P(DC) / P(C) = 0.3 / 0.4 = 0.75
P(DC) = probability of a colored & dotted ball (3 out of 10 = 3/10)
P(C) = 4 out of 10 = 4/10
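The following sketch recomputes the marginal, joint, and conditional probabilities for the ball-box example directly from the counts; it simply re-derives the numbers above:

```python
from fractions import Fraction

# Ball counts from the example: (color, pattern) -> number of balls.
box = {
    ("colored", "dotted"): 3,
    ("colored", "striped"): 1,
    ("gray", "dotted"): 2,
    ("gray", "striped"): 4,
}
total = sum(box.values())  # 10 balls

p_colored = Fraction(box[("colored", "dotted")] + box[("colored", "striped")], total)
p_dotted_and_colored = Fraction(box[("colored", "dotted")], total)
p_dotted_given_colored = p_dotted_and_colored / p_colored

print(p_colored)               # 2/5  -> 0.4
print(p_dotted_and_colored)    # 3/10 -> 0.3
print(p_dotted_given_colored)  # 3/4  -> 0.75
```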
Type of probability (symbol): formula under statistical independence; formula under statistical dependence
1. Marginal (P(A)): P(A); P(A)
2. Joint (P(AB)): P(A) x P(B); P(A|B) x P(B)
3. Conditional (P(A|B)): P(A); P(AB) / P(B)
Revising Prior Estimates of Probabilities:
Bayes’ Theorem
 A very important & useful application of conditional
probability is the computation of unknown
probabilities, based on past data or information.
 When an event occurs through one of the various
mutually disjoint events, then the conditional
probability that this event has occurred due to a
particular reason or event is termed as Inverse
Probability or Posterior Probability.
 Has wide ranging applications in Business & its
Management.
 Since it is a concept of revision of probability based
on some additional information, it shows the
improvement towards certainty level of the event.
 Example 1: - If a manager of a boutique finds that
most of the purple & white jackets that she thought
would sell so well are hanging on the rack, she must
revise her prior probabilities & order a different color
combination or have a sale.
 Certain probabilities were altered after the people
got additional information. New probabilities are
known as revised, or Posterior probabilities.
Bayes' Theorem
 If an event A can occur only in conjunction with n mutually exclusive & exhaustive events B1, B2, …, Bn, and A actually happens, then, provided the conditional probabilities P(A | B1), P(A | B2), …, P(A | Bn) and the marginal probabilities P(Bi) are known, the posterior probability of event Bi given that event A has occurred is:
P(Bi | A) = P(A | Bi) P(Bi) / ∑ P(A | Bk) P(Bk), where the sum runs over k = 1, …, n
Remarks: -
 The probabilities P(B1), P(B2), … , P(Bn) are termed
as the ‘a priori probabilities’ because they exist
before we gain any information from the
experiment itself.
The probabilities P(A | Bi), i=1,2,…,n are called
‘Likelihoods’ because they indicate how likely the event
A under consideration is to occur, given each & every
a priori probability.
The probabilities P(Bi | A), i=1, 2, …,n are called
‘Posterior probabilities’ because they are determined
after the results of the experiment are known.
Bayes’ Formula
Problem
 In a bolt factory, machines A, B, & C manufacture respectively 25%, 35%, & 40% of the total output. Of their output, 5%, 4%, & 2% respectively are defective bolts. A bolt is drawn at random from the product & is found to be defective. What are the probabilities that it was manufactured by machines A, B & C?
Solution
 Let E1, E2, E3 denote the events manufactured by
machines A, B & C respectively.
 Let E denote the event of its being defective.
P(E1) = 0.25; P(E2) = 0.35; P(E3) = 0.40;
Probability of drawing a defective bolt
manufactured by machine A is P(E|E1) = 0.05
Similarly P(E|E2) = 0.04; P(E|E3) = 0.02
Probability that a defective bolt selected at random was manufactured by machine A is given by
P(E1|E) = P(E1) P(E|E1) / ∑ P(Ei) P(E|Ei), i = 1 to 3
= (0.25*0.05) / (0.25*0.05 + 0.35*0.04 + 0.40*0.02)
= 25/69
Similarly,
P(E2|E) = (0.35*0.04) / (0.25*0.05 + 0.35*0.04 + 0.40*0.02) = 28/69
P(E3|E) = (0.40*0.02) / (0.25*0.05 + 0.35*0.04 + 0.40*0.02) = 16/69
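A short sketch that reproduces these posterior probabilities with a generic Bayes' rule helper; the priors and likelihoods are taken directly from the problem statement:

```python
from fractions import Fraction

def posteriors(priors, likelihoods):
    # Bayes' theorem: P(Bi | A) = P(A | Bi) P(Bi) / sum_k P(A | Bk) P(Bk)
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

priors      = [Fraction(25, 100), Fraction(35, 100), Fraction(40, 100)]  # machines A, B, C
likelihoods = [Fraction(5, 100),  Fraction(4, 100),  Fraction(2, 100)]   # P(defective | machine)

for machine, p in zip("ABC", posteriors(priors, likelihoods)):
    print(machine, p)   # A 25/69, B 28/69, C 16/69
```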
Suppose that one person in 100,000 has a particular rare disease for which there is a fairly accurate diagnostic test. This test is correct 99% of the time when given to someone with the disease; it is correct 99.5% of the time when given to someone who does not have the disease.
Given this information, can we find
(a) the probability that someone who tests positive for the disease has the disease?
(b) the probability that someone who tests negative for the disease does not have the disease?
Should someone who tests positive be very concerned that he or she has the disease?
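The slides leave these questions open; the sketch below shows how the same Bayes' rule answers them, using only the numbers given in the problem:

```python
p_disease   = 1 / 100_000   # prior P(D): one person in 100,000
sensitivity = 0.99          # P(positive | D)
specificity = 0.995         # P(negative | no D)

p_no_disease = 1 - p_disease
p_positive = sensitivity * p_disease + (1 - specificity) * p_no_disease
p_negative = 1 - p_positive

p_disease_given_positive = sensitivity * p_disease / p_positive
p_no_disease_given_negative = specificity * p_no_disease / p_negative

print(p_disease_given_positive)      # (a) roughly 0.002, i.e. about 0.2%
print(p_no_disease_given_negative)   # (b) very close to 1
```

So, perhaps surprisingly, a positive test still corresponds to only about a 0.2% chance of actually having the disease, because the disease itself is so rare; a negative test leaves the person almost certainly disease-free.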
Glossary of terms
 Classical Probability: It is based on the idea that
certain occurrences are equally likely.
 Example: - Numbers 1, 2, 3, 4, 5, & 6 on a fair die are
each equally likely to occur.
 Conditional Probability: The probability that an event
occurs given the outcome of some other event.
 Independent Events: Events are independent if the
occurrence of one event does not affect the
occurrence of another event.
 Joint Probability: The likelihood that 2 or more events will happen at the same time.
 Multiplication Formula: If there are m ways of doing
one thing and n ways of doing another thing, there are
m x n ways of doing both.
 Mutually exclusive events: A property of a set of categories
such that an individual, object, or measurement is included in
only one category.
 Objective Probability: It is based on symmetry of games of
chance or similar situations.
 Outcome: Observation or measurement of an experiment.
 Posterior Probability: A revised probability based on additional
information.
 Prior Probability: The initial probability based on the present
level of information.
 Probability: A value between 0 and 1, inclusive, describing the
relative possibility (chance or likelihood) an event will occur.
 Subjective Probability: Synonym for personal probability.
Involves personal judgment, information, intuition, & other
subjective evaluation criteria.
 Example: - A physician assessing the probability of a
patient’s recovery is making a personal judgment based
on what they know and feel about the situation.
Content, graphics and text belong to the rightful owner. No copyright intended.