Stochastic Process and Markov Chain
Discrete Markov Chain
Actuarial Mathematics II
Lecture 4
Dr. Shaiful Anuar
Institute Of Mathematical Sciences
University of Malaya
Table of contents
1 Stochastic Process and Markov Chain
2 Discrete Markov Chain
  Examples 1–4
Stochastic Process and Markov Chain
A stochastic process is a collection of random variables, normally represented by {Yt, t ∈ T}, where t is often referred to as time and Yt is the state of the process at time t, with Yt ∈ S.
S is known as the state space: the set of all possible values of the random variables.
A stochastic process can be categorized as follows, depending on T:
discrete-time stochastic process: when T is of discrete/countable type
continuous-time stochastic process: when T is of continuous/interval type
Similarly, a Markov chain can be of discrete or continuous type.
A Markov chain is a stochastic process with the following properties:
(i) P(Yt+1 ∈ S | Yt = i) = 1
(ii) P(Yt+1 = j | Yt = i, Yt−1 = it−1, · · · , Y0 = i0) = P(Yt+1 = j | Yt = i)
An important characteristic of a Markov chain, expressed in property (ii) above, is the memoryless (Markov) property: the transition probabilities depend only on the current state, not on the earlier history.
In insurance, we apply this concept in multi-state models, which use a Markov chain to specify the probabilities of moving between the various states of the model.
These probabilities are known as transition probabilities.
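As a quick illustration of the memoryless property, a chain can be simulated step by step: the next state is drawn using only the current state. The state labels and probabilities below are illustrative, not taken from the lecture.

```python
import random

# Hypothetical one-step transition probabilities: P[current][next].
P = {
    "Alive": {"Alive": 0.6, "Dead": 0.4},
    "Dead":  {"Alive": 0.0, "Dead": 1.0},
}

def step(state, rng):
    """Draw the next state using only the current state (the Markov property)."""
    u = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if u < cum:
            return nxt
    return nxt  # guard against floating-point round-off

rng = random.Random(0)
path = ["Alive"]
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)
```

Because "Dead" is absorbing, every simulated path stays in "Dead" once it arrives there.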
Several popular multi-state models are given below:
(i) The alive-dead model: Alive (1) → Dead (2)
(ii) The double indemnity/accidental death model: Alive (1) → Dead - Accident (2), Alive (1) → Dead - Other causes (3)
(iii) The permanent disability model: Healthy (1) → Disabled (2), Healthy (1) → Dead (3), Disabled (2) → Dead (3)
(iv) The disability income model: Healthy (1) ↔ Sick (2), Healthy (1) → Dead (3), Sick (2) → Dead (3)
Discrete Markov Chain
The most common model that we have come across is the single decrement model, with a transition between two states:
The alive-dead model: Alive (1) → Dead (2)
It is common to represent the transition probabilities in the form of a matrix. For the above example, assume that the probability of moving from state "Alive" to state "Dead" equals 0.4. Then the transition matrix can be represented as follows:

      1    2
1   0.6  0.4
2     0    1
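In code, the same matrix can be stored as nested lists; a minimal sketch (row = current state, column = next state) that also checks that each row sums to 1:

```python
# States: index 0 = Alive (1), index 1 = Dead (2).
Q = [[0.6, 0.4],
     [0.0, 1.0]]

# Each row of a transition matrix must sum to 1.
for row in Q:
    assert abs(sum(row) - 1.0) < 1e-12

print(Q[0][1])  # P(Alive -> Dead in one step) = 0.4
```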
The one-step transition probability is the probability of moving from one state to another in a single step. It is denoted by

Qt(i,j) = P(Yt+1 = j | Yt = i)

The k-step transition probability kQt(i,j) is the probability of moving from state i at time t to state j at time t + k. It is obtained by composing one-step probabilities (the Chapman-Kolmogorov relation); for example, for k = 2,

2Qt(i,j) = Σl Qt(i,l) · Qt+1(l,j)

As mentioned earlier, the transition probabilities are collected in a matrix known as the transition matrix.
If the transition probabilities do not vary with time, the chain is referred to as a homogeneous Markov chain.
If the transition probabilities vary with time, it is referred to as a non-homogeneous Markov chain.
For a homogeneous Markov chain the transition probabilities are independent of t, so the time subscript can be dropped; for k = 2,

2Q(i,j) = Σl Q(i,l) · Q(l,j)

Therefore, if Q is the one-step transition matrix, the k-step transition matrix is simply given by

kQ = Q^k
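The relation kQ = Q^k can be checked with a few lines of plain Python; the matrix below is the alive-dead example from earlier, and the matmul/matpow helpers are written out only to keep the sketch self-contained:

```python
def matmul(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(Q, k):
    """k-step transition matrix of a homogeneous chain: Q^k."""
    R = Q
    for _ in range(k - 1):
        R = matmul(R, Q)
    return R

Q = [[0.6, 0.4],
     [0.0, 1.0]]
Q2 = matpow(Q, 2)
# Two-step Alive -> Dead probability: die in year 1, or survive then die:
# 0.4 + 0.6 * 0.4 = 0.64.
print(round(Q2[0][1], 4))  # 0.64
```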
Example 1
For a homogeneous Markov chain, the transition probability matrix
is given by

      1    2
1   0.4  0.6
2   0.8  0.2

Calculate 3Q(2,1).
Solution
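As a numerical check of this example, 3Q(2,1) is the (2,1) entry of Q^3. A pure-Python sketch (the matmul helper is included only for self-containedness):

```python
def matmul(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

Q = [[0.4, 0.6],
     [0.8, 0.2]]
Q3 = matmul(matmul(Q, Q), Q)
print(round(Q3[1][0], 4))  # 3Q(2,1) = 0.608
```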
Example 2
An auto insured who was claim-free during a policy period will be claim-free during the next policy period with probability 0.9. An auto insured who was not claim-free during a policy period will be claim-free during the next policy period with probability 0.7. What is the probability that an insured who was claim-free during policy period 0 will incur a claim during period 3?
Solution
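A numerical check, reading the question as asking for the probability of being in the "not claim-free" state in period 3 given claim-free in period 0 (i.e. the claim-free→claim entry of Q^3):

```python
def matmul(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

# State 0 = claim-free, state 1 = not claim-free (a claim occurred).
Q = [[0.9, 0.1],
     [0.7, 0.3]]
Q3 = matmul(matmul(Q, Q), Q)
print(round(Q3[0][1], 4))  # 0.124
```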
Example 3
Consider a homogeneous Markov model with three states, Healthy
(0), Disabled (1) and Dead (2).
(i) The annual transition matrix is given by

      0     1     2
0   0.70  0.20  0.10
1   0.10  0.65  0.25
2      0     0     1
(ii) There are 100 lives at the start, all Healthy. Their future
states are independent.
(iii) Assume that all lives have the same age at start.
Calculate the variance of the number of the original lives who die
within the first two years.
Solution
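Since the 100 lives are independent and start Healthy, the number dying within two years is Binomial(100, p), where p is the two-step Healthy→Dead probability; the variance is then 100·p·(1−p). A sketch:

```python
def matmul(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

Q = [[0.70, 0.20, 0.10],   # Healthy (0)
     [0.10, 0.65, 0.25],   # Disabled (1)
     [0.0,  0.0,  1.0]]    # Dead (2), absorbing
Q2 = matmul(Q, Q)
p = Q2[0][2]             # two-year Healthy -> Dead probability: 0.22
var = 100 * p * (1 - p)  # Binomial(100, p) variance
print(round(p, 4), round(var, 4))  # 0.22 17.16
```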
Example 4
A life insurance policy waives premium upon disability of the
insured. The policy is modeled as a homogeneous Markov chain
with the three states: active, disabled, gone. The annual transition
matrix is

      1     2     3
1   0.75  0.15  0.10
2   0.50  0.30  0.20
3      0     0     1
Currently 90% of the insureds are active and 10% are disabled.
(a) Calculate the percentage of the current population of insureds
that are (1) active, (2) disabled and (3) gone at the end of
three years.
(b) Calculate the probability that a currently disabled life will be
active at the end of three years.
Solution