Decision analysis & Markov chain
~1~
Sawsan Monir * 20142629
Definition of Decision Analysis (DA)
A systematic, quantitative and visual approach to addressing and
evaluating important choices confronted by businesses. Decision
analysis utilizes a variety of tools to evaluate all relevant information
to aid in the decision making process. A graphical representation of
alternatives and possible solutions, as well as challenges and
uncertainties, can be created on a decision tree or influence diagram.
Decision analysis (DA) has been applied to business problems in
management, marketing, operations, accounting, and finance. In
addition, it has had an impact on the fields of medicine, law, military
science, environmental sciences, and public policy more generally.
The Six Steps in Decision Analysis
1. Clearly define the problem at hand
2. List the possible alternatives
3. Identify the possible outcomes or states of nature
4. List the payoff or profit of each combination of alternatives and
outcomes
5. Select one of the mathematical decision theory models
6. Apply the model and make your decision
Step 1 – Define the problem
Expand by manufacturing and marketing a new product, backyard
storage sheds
Step 2 – List alternatives
Construct a large new plant, small plant or no plant at all
Step 3 – Identify possible outcomes
The market could be favorable or unfavorable
Types of Decision-Making Environments
Type 1: Decision making under certainty
Decision maker knows with certainty the consequences of every
alternative or decision choice
Type 2: Decision making under uncertainty
The decision maker does not know the probabilities of the various
outcomes
There are several criteria for making decisions under uncertainty
1. Maximax (optimistic)
Used to find the alternative that maximizes the maximum payoff
 Locate the maximum payoff for each alternative
 Select the alternative with the maximum number
                         State of nature
Alternative              Favorable     Unfavorable    Maximum in
                         market ($)    market ($)     a row ($)
Construct a large plant  200,000       –180,000       200,000  ← maximax
Construct a small plant  100,000       –20,000        100,000
Do nothing               0             0              0
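As a minimal sketch, the maximax rule can be written in a few lines of Python using the payoff table from the example:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}

# Maximax: take the best payoff of each alternative, then choose the largest.
maximax_choice = max(payoffs, key=lambda a: max(payoffs[a]))
print(maximax_choice)  # -> large plant (row maximum 200,000)
```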
2. Maximin (pessimistic)
Used to find the alternative that maximizes the minimum payoff
 Locate the minimum payoff for each alternative
 Select the alternative with the maximum number
                         State of nature
Alternative              Favorable     Unfavorable    Minimum in
                         market ($)    market ($)     a row ($)
Construct a large plant  200,000       –180,000       –180,000
Construct a small plant  100,000       –20,000        –20,000
Do nothing               0             0              0  ← maximin
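A minimal Python sketch of the maximin rule, using the payoff table from the example:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}

# Maximin: take the worst payoff of each alternative, then choose the largest.
maximin_choice = max(payoffs, key=lambda a: min(payoffs[a]))
print(maximin_choice)  # -> do nothing (best worst-case payoff: 0)
```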
3. Criterion of realism (Hurwicz)
A weighted average compromise between optimistic and
pessimistic
 Select a coefficient of realism α
 Coefficient is between 0 and 1
 A value of 1 is 100% optimistic
 Compute the weighted averages for each alternative
 Select the alternative with the highest value
 Weighted average = α (maximum in row) + (1 – α )(minimum
in row)
o For the large plant alternative using α = 0.8
(0.8)(200,000) + (1 – 0.8)(–180,000) = 124,000
o For the small plant alternative using α = 0.8
(0.8)(100,000) + (1 – 0.8)(–20,000) = 76,000
                         State of nature
Alternative              Favorable     Unfavorable    Criterion of realism
                         market ($)    market ($)     (α = 0.8) ($)
Construct a large plant  200,000       –180,000       124,000  ← realism
Construct a small plant  100,000       –20,000        76,000
Do nothing               0             0              0
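The Hurwicz weighted-average computation can be sketched as follows, with α = 0.8 as in the worked example:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}
alpha = 0.8  # coefficient of realism used in the example

# Weighted average = alpha * (row maximum) + (1 - alpha) * (row minimum)
scores = {a: alpha * max(p) + (1 - alpha) * min(p) for a, p in payoffs.items()}
best = max(scores, key=scores.get)
# Scores: large plant 124,000; small plant 76,000; do nothing 0 -> large plant
```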
4. Equally likely (Laplace)
Considers all the payoffs for each alternative
 Find the average payoff for each alternative
 Select the alternative with the highest average
                         State of nature
Alternative              Favorable     Unfavorable    Row
                         market ($)    market ($)     average ($)
Construct a large plant  200,000       –180,000       10,000
Construct a small plant  100,000       –20,000        40,000  ← equally likely
Do nothing               0             0              0
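A sketch of the equally likely (Laplace) calculation:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}

# Equally likely: average the payoffs in each row, pick the highest average.
averages = {a: sum(p) / len(p) for a, p in payoffs.items()}
best = max(averages, key=averages.get)
# Averages: large plant 10,000; small plant 40,000; do nothing 0 -> small plant
```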
5. Minimax regret
Based on opportunity loss, or regret: the difference between the
optimal payoff and the actual payoff for a decision
 Create an opportunity loss table by determining the
opportunity loss for not choosing the best alternative
 Opportunity loss is calculated by subtracting each payoff in the
column from the best payoff in the column
 Find the maximum opportunity loss for each alternative and
pick the alternative with the minimum number
                         State of nature (opportunity loss)
Alternative              Favorable                    Unfavorable               Maximum in
                         market ($)                   market ($)                a row ($)
Construct a large plant  200,000 – 200,000 = 0        0 – (–180,000) = 180,000  180,000
Construct a small plant  200,000 – 100,000 = 100,000  0 – (–20,000) = 20,000    100,000  ← minimax
Do nothing               200,000 – 0 = 200,000        0 – 0 = 0                 200,000
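The two-pass regret computation (build the opportunity-loss table, then apply minimax to it) can be sketched as:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}

# Best payoff in each column (state of nature): 200,000 and 0
by_state = list(zip(*payoffs.values()))
best_per_state = [max(col) for col in by_state]

# Opportunity loss: best payoff in the column minus each payoff in that column
regret = {a: [b - x for b, x in zip(best_per_state, p)]
          for a, p in payoffs.items()}

# Minimax regret: pick the alternative with the smallest maximum regret.
choice = min(regret, key=lambda a: max(regret[a]))
print(choice)  # -> small plant (maximum regret 100,000)
```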
Type 3: Decision making under risk
The decision maker knows the probabilities of the various outcomes
 Decision making when there are several possible states of
nature and we know the probabilities associated with each
possible state
 Most popular method is to choose the alternative with the
highest expected monetary value (EMV)
 EMV(alternative i) = (payoff of first state of nature) ×
(probability of first state of nature) + (payoff of second state
of nature) × (probability of second state of nature) + … +
(payoff of last state of nature) × (probability of last state of
nature)
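The EMV rule can be sketched in Python. The text does not fix the state-of-nature probabilities, so the 0.5 used here is an illustrative assumption:

```python
# Payoff table from the example: alternative -> (favorable, unfavorable) payoffs
payoffs = {
    "large plant": (200_000, -180_000),
    "small plant": (100_000, -20_000),
    "do nothing": (0, 0),
}
p_favorable = 0.5  # assumed probability of a favorable market (not from the text)

# EMV = sum over states of (payoff in that state) * (probability of that state)
emv = {a: p_favorable * fav + (1 - p_favorable) * unfav
       for a, (fav, unfav) in payoffs.items()}
best = max(emv, key=emv.get)
# EMVs: large plant 10,000; small plant 40,000; do nothing 0 -> small plant
```

With different probabilities the ranking can change, which is the point of the EMV criterion: it weights each payoff by how likely its state of nature is.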
Any problem that can be presented in a decision table can also
be graphically represented in a decision tree
 Decision trees are most beneficial when a sequence of decisions
must be made
 All decision trees contain decision points or nodes and state-of-
nature points or nodes
 A decision node from which one of several alternatives may be
chosen
 A state-of-nature node out of which one state of nature will
occur
Markov chain
We describe a Markov chain as follows: We have a set of states,
S = {s1, s2, …, sr}. The process starts in one of these states and
moves successively from one state to another. Each move is called a
step. If the chain is currently in state si, then it moves to state sj
at the next step with a probability denoted by pij, and this
probability does not depend upon which states the chain was in
before the current state.
The probabilities pij are called transition probabilities. The process
can remain in the state it is in, and this occurs with probability pii.
An initial probability distribution, defined on S, specifies the
starting state. Usually this is done by specifying a particular state
as the starting state.
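A minimal sketch of taking one step of such a chain. The transition matrix is illustrative, not from the text; each row holds the probabilities pij out of one state and must sum to 1:

```python
import random

# Illustrative transition probabilities pij for S = {s1, s2, s3}.
P = {
    "s1": {"s1": 0.6, "s2": 0.3, "s3": 0.1},
    "s2": {"s1": 0.2, "s2": 0.5, "s3": 0.3},
    "s3": {"s1": 0.0, "s2": 0.0, "s3": 1.0},  # s3 can only remain in s3
}

def step(state):
    """Move the chain one step from `state`, sampling from the row P[state]."""
    targets = list(P[state])
    weights = [P[state][t] for t in targets]
    return random.choices(targets, weights=weights)[0]

state = "s1"         # starting state (initial distribution: all mass on s1)
state = step(state)  # one step of the chain
```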
Markov models are useful when a decision problem involves risk that
is continuous over time, when the timing of events is important, and
when important events may happen more than once. Markov
models assume that a patient is always in one of a finite number of
discrete health states, called Markov states. The ability of the Markov
model to represent repetitive events and the time dependence of
both probabilities and utilities allows for more accurate
representation of clinical settings that involve these issues.
Markov models are particularly useful when a decision problem
involves a risk that is ongoing over time. Some clinical examples are
the risk of hemorrhage while on anticoagulant therapy, the risk of
rupture of an abdominal aortic aneurysm, and the risk of mortality in
any person, whether sick or healthy.
There are two important consequences of events that have ongoing
risk. First, the times at which the events will occur are uncertain. For
example, a stroke that occurs immediately may have a different
impact on the patient than one that occurs ten years later. The
second consequence is that a given event may occur more than
once. As the following example shows, representing events that are
repetitive or that occur with uncertain timing is difficult using a
simple tree model.
There are five steps for Markov modeling:
(1) choose the health states that represent the possible outcomes
from each intervention; (2) determine possible transitions between
health states; (3) choose how long each cycle should be and how
many cycles will be analyzed; (4) estimate the probabilities
associated with moving (i.e., transitioning) in and out of health
states; (5) estimate the costs and outcomes associated with each
option.
Step 1: Choose Health States
These are referred to as Markov states. Patients cannot be in more
than one health state during each cycle. A simple general example is
“well, sick, or dead.” Graphically, by convention, each health state is
placed in an oval or circle in a bubble diagram.
Step 2: Determine Transitions
Patients move (i.e., transition) from one health state to another.
If the patient dies, the dead state is called an absorbing state:
once in an absorbing state, patients cannot move to another health
state in a later cycle. Graphically, arrows are used to indicate
which transitions are allowed.
For cycle 1, each patient can stay well, or can move to the sick or
dead states. For the next cycle, patients in the well state can again
stay well or move to the sick or dead states. Those in the dead state
cannot move back to the other two states. Depending on the disease
of interest, patients may or may not be able to move back to the well
state after being in the sick state.
Step 3: Choose the Cycle Length and Number of Cycles
The cycle length depends on the disease being modeled. For the
example of patients with a blood clot, a cycle of 1 week might be
enough time to determine the number of patients with additional
blood clots or bleeding. For chronic diseases, a cycle length of 1
year is commonly used.
Step 4: Estimate Transition Probabilities
Transition probabilities are used to estimate the percent of
patients who are
likely to move from one health state to another during each cycle.
These probability values usually come from previous research or
expert panel estimates.
Step 5: Calculate Costs and Outcomes
Outcomes for each health state should be estimated and given a
value. If the outcome of interest is years of life gained or saved
and each cycle is 1 year, then each person who is alive during a
cycle gets a value of 1.0 as his or her outcome for that cycle. It is
common to adjust each year of life in each cycle for the quality of
health that year.
The two basic calculation methods used to determine the results
of a Markov analysis are cohort simulation and Monte Carlo
simulation.
Cohort simulation uses a hypothetical group (cohort) of patients
that usually
start out in the same health state. At each cycle, the transition
probabilities are applied. (Probabilities may be the same for every
cycle if using a Markov chain analysis, or they may vary by cycle if
using a Markov process analysis.) The number of patients in each
cycle is calculated and summed using matrix algebra. This type of
calculation can incorporate discount rates to account for time
value associated with costs and outcomes.
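A minimal cohort-simulation sketch for a three-state well/sick/dead model. The per-cycle transition probabilities and the cycle count are illustrative assumptions, not values from the text:

```python
# Rows: from-state; columns: to-state (well, sick, dead). Rows sum to 1.
P = [
    [0.7, 0.2, 0.1],  # from well: stay well, become sick, die
    [0.0, 0.6, 0.4],  # from sick: (no recovery assumed), stay sick, die
    [0.0, 0.0, 1.0],  # dead is an absorbing state
]

cohort = [1000.0, 0.0, 0.0]  # 1,000 patients, all starting in the well state
life_years = 0.0
for cycle in range(20):  # twenty 1-year cycles
    life_years += cohort[0] + cohort[1]  # patients alive this cycle
    # Apply the transition probabilities (one matrix-vector multiplication).
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]

# cohort[2] now holds the expected number of deaths after 20 cycles;
# life_years / 1000 is the expected (undiscounted) life-years per patient.
```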
Monte Carlo simulation is a type of stochastic analysis that
takes into account uncertainty or variability at the patient level. A
random patient is sent through the model, and outcomes and
costs are calculated individually for that patient. Then one by one,
more random patients are sent through the model. The path
through the model that each patient may take is different because
of random variation, and results for a specific model can result in
different answers each time the simulation is conducted because
of the randomness at chance nodes in the model. If a large
number of patients (e.g., 100,000) are sent through the model one
at a time, the results may be close to the results of the cohort
simulation.
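A minimal Monte Carlo sketch for a three-state well/sick/dead model, sending random patients through one at a time. The transition probabilities, horizon, and patient count are illustrative assumptions:

```python
import random

# Rows: from-state; columns: to-state (well, sick, dead). Rows sum to 1.
P = [
    [0.7, 0.2, 0.1],
    [0.0, 0.6, 0.4],
    [0.0, 0.0, 1.0],  # dead is an absorbing state
]
WELL, SICK, DEAD = 0, 1, 2

def run_patient(cycles=20):
    """Track one random patient; return the number of cycles spent alive."""
    state, alive = WELL, 0
    for _ in range(cycles):
        if state == DEAD:
            break
        alive += 1
        state = random.choices([WELL, SICK, DEAD], weights=P[state])[0]
    return alive

random.seed(1)  # fixed seed so the run is repeatable
n = 10_000
mean_life_years = sum(run_patient() for _ in range(n)) / n
```

Because each patient's path is random, repeated runs (with different seeds) give slightly different answers; with a large n the mean approaches the cohort-simulation result, as the text notes.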
Disadvantages of Markov Modeling
By their nature, Markov models can be more complex than simple
decision trees and therefore less transparent to decision makers.
A commonly cited disadvantage of Markov modeling is that it is
“memoryless” because the Markovian assumption is that the
probability of moving from state to state is not based on the
previous experiences from former cycles.
More advanced and complex computations, such as using tunnel
states, allow for integration of health experiences from previous
cycles. Another disadvantage is that the data needed to estimate
probabilities and costs, especially in the long term, are often
unavailable.
Markov models can therefore become somewhat contrived if
these implicit assumptions do not reflect sufficiently well the
characteristics of a system and how it functions in practice.
 Can require a large number of states
 Model can be difficult to construct and validate
 "Markov" Property assumption and component failure
distribution assumptions may be invalid for the system being
modeled
 Model types of greatest complexity require solution
techniques that are currently feasible only for small models
 Model is often not structurally similar to the physical or
logical organization of the system
Advantages of Markov Modeling
Markov analysis has the advantage of being an analytical method,
which means that the reliability parameters for the system are
calculated, in effect, by a formula. This gives it the considerable
advantages of speed and accuracy when producing results. Speed is
especially useful when investigating many alternative variations of
a design or exploring a range of sensitivities. Accuracy, in
contrast, is vitally important when investigating small design
changes or when the reliability or availability of high-integrity
systems is being quantified.
 Can model repair in a natural way:
 Repairs of individual components and groups
 Variable number of repair persons
 Sequential repair; Partial repair (degraded components)
 Can model standby spares (hot, warm, cold)
 Can model sequence dependencies:
 Functional dependencies
 Sequence enforcement
 Can model imperfect coverage more naturally than
combinatorial models
 Can model fault/error handling and recovery at a detailed
level