Monte Carlo Methods
~Theory and Applications~
Michael Bell
Initial Assumptions
We have a computational means of generating
random numbers.
We will handle only cases in which P(x) can be
evaluated for all possible x in the domain.
We know some basic mathematics.
(calculus, statistics, algebra)
We want to perform high dimensional integration
(without lengthy/impossible calculations)
The Challenge of Sampling P(x)
We wish to sample from
P(x) = P*(x) / Z
where P*(x) is an unnormalized density we can evaluate and Z is the normalization constant.
Problem 1: We usually don't know the constant Z.
Problem 2: Even if we know Z, how do we draw the samples (in higher dimensions)?
The Curse of Dimensionality
Determine the normalization constant
Now add more parameters... :(
Uniform Sampling
We can't feasibly sample everywhere in the
parameter space.
Instead, let's try to estimate it by drawing random
samples.
The Procedure
1) Draw R random samples {x^(r)} uniformly from the parameter space.
2) Evaluate P*(x) at those points.
3) Evaluate Z_R = ∑_{r=1}^{R} P*(x^(r)).
4) Estimate the expectation by
Ê[f(x)] = ∑_{r=1}^{R} f(x^(r)) P*(x^(r)) / Z_R
Great...but it still sucks.
If P(x) is uniform, this method will work very well.
However, for a nonuniform distribution the
probability is typically confined to a small volume
of the parameter space, so very few uniform draws
land where P*(x) is non-negligible. Hence we would
require an enormous number of samples, making this
method essentially useless.
What will we do?!
It's a bird...
it's a plane...
it's a series of various
MONTE CARLO METHODS
Single Dimensional
Algorithms/Procedures
Importance Sampling
Rejection Sampling
Metropolis-Hastings Method
Goals
Generate random samples {x^(r)}_{r=1}^{R} from a distribution P(x).
Estimate the expectation of functions under that distribution,
E[f(x)] = ⟨f(x)⟩ ≡ ∫ P(x) f(x) dx_1...dx_n,
using the estimator:
Ê[f(x)] = (1/R) ∑_r f(x^(r))
Why should you care?
We can write the variance of f under P as:
Var(f) = ⟨(f(x) − ⟨f(x)⟩)²⟩ = ∫ P(x) (f(x) − ⟨f(x)⟩)² dx_1...dx_n
The variance of the estimator will decrease as Var(f)/R.
The accuracy of this method is independent of the
dimensionality of the space sampled.
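The Var(f)/R scaling can be checked numerically; the distribution and the function f below are arbitrary illustrative choices:

```python
import random
import statistics

def mc_mean(R, rng):
    # Monte Carlo estimate of E[f(x)] with f(x) = x**2 and x drawn
    # uniformly from [0, 1]; the true value is 1/3.
    return sum(rng.random() ** 2 for _ in range(R)) / R

rng = random.Random(42)
for R in (100, 1_000, 10_000):
    estimates = [mc_mean(R, rng) for _ in range(200)]
    # The spread of repeated estimates should shrink like 1/sqrt(R).
    print(R, statistics.stdev(estimates))
```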
Importance Sampling
When P(x) is too complicated, we can choose an
alternative “proposal” density, Q(x).
1) Generate R samples from Q(x).
2) Weight them according to their “importance”
(i.e. their value relative to P(x))
3) Evaluate the Estimator
w_r ≡ P*(x^(r)) / Q*(x^(r))
Ê[f(x)] = ∑_r w_r f(x^(r)) / ∑_r w_r
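The three steps can be sketched as follows; the unnormalized Gaussian target and the standard normal proposal are illustrative choices, not taken from the deck:

```python
import math
import random

def p_star(x):
    # Unnormalized target (illustrative): Gaussian, mean 1, sd 0.7.
    return math.exp(-(x - 1.0) ** 2 / (2 * 0.7 ** 2))

def q_density(x):
    # Proposal Q(x): standard normal, which we *can* sample directly.
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def importance_estimate(f, R=200_000, seed=0):
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(R)]        # 1) sample from Q
    ws = [p_star(x) / q_density(x) for x in xs]         # 2) importance weights
    return sum(w * f(x) for x, w in zip(xs, ws)) / sum(ws)  # 3) estimator

# E[x] under the target is 1 by construction.
print(importance_estimate(lambda x: x))
```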
Rejection Sampling
Again, we choose a proposal density, Q(x).
1) Determine a constant, c, such that c·Q*(x) > P*(x) for all x.
2) Generate a random number x from Q(x).
3) Evaluate c·Q*(x) and generate a uniformly distributed sample, u, in the interval [0, c·Q*(x)].
4) Evaluate P*(x).
5) Accept x if u < P*(x); reject it if u > P*(x).
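The procedure can be sketched in a few lines; the target here (proportional to a Beta(2, 2) density) and the uniform proposal are illustrative choices:

```python
import random

def p_star(x):
    # Unnormalized target (illustrative): x*(1-x) on [0, 1],
    # proportional to a Beta(2, 2) density.
    return x * (1 - x)

def rejection_sample(n, seed=0):
    # Proposal Q is uniform on [0, 1], so c*Q*(x) = c = 0.25 bounds
    # p_star, since the maximum of x*(1-x) on [0, 1] is 0.25.
    rng = random.Random(seed)
    c = 0.25
    samples = []
    while len(samples) < n:
        x = rng.random()             # 2) draw x from Q
        u = rng.uniform(0.0, c)      # 3) u uniform on [0, c*Q*(x)]
        if u < p_star(x):            # 4)-5) accept if u < P*(x)
            samples.append(x)
        # otherwise reject and draw again
    return samples

xs = rejection_sample(100_000)
print(sum(xs) / len(xs))  # the mean of Beta(2, 2) is 0.5
```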
Metropolis-Hastings Method
What if we can't find a “nice” proposal density?
Instead, generate random samples from an
“evolving” proposal density that more closely
approximates P(x) with each additional sample.
This implies the need for an iterative process,
which will require a Markov Chain (hence MCMC).
Metropolis-Hastings Algorithm
Let f(x) be proportional to the desired P(x).
Choose an arbitrary first sample, x, and a proposal
density Q(x'|x) (typically a Gaussian centered on x).
For each iteration:
1) Generate a candidate x' from Q(x'|x).
2) Calculate the acceptance ratio a = f(x')/f(x).
3) If a ≥ 1, accept the candidate: set x = x'.
4) If a < 1, accept the candidate with probability a;
otherwise keep the current x. Then repeat from step 1.
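A minimal Metropolis sampler along these lines (a symmetric Gaussian proposal, so the candidate is accepted with probability min(1, a)); the unnormalized Gaussian target is an illustrative choice:

```python
import math
import random

def f(x):
    # Proportional to the desired P(x) (illustrative target):
    # an unnormalized Gaussian with mean 3 and standard deviation 1.
    return math.exp(-(x - 3.0) ** 2 / 2)

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0                              # arbitrary first sample
    chain = []
    for _ in range(n_samples):
        x_new = rng.gauss(x, step)       # candidate from Q(x'|x)
        a = f(x_new) / f(x)              # acceptance ratio
        if a >= 1 or rng.random() < a:   # accept with probability min(1, a)
            x = x_new
        chain.append(x)
    return chain

chain = metropolis(50_000)
burned = chain[5_000:]  # discard burn-in before the chain has mixed
print(sum(burned) / len(burned))  # should be near the target mean, 3
```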
A Summary of the Algorithms
We can implement Monte Carlo methods on any
distribution, P(x), that can be expressed as
P(x) = P*(x) / Z
Each converts integrals to sums as
∫ f(x) P(x) dx ≈ (1/R) ∑_r f(x^(r))
An Elementary Application
TASK: Estimate Pi Using a Monte Carlo Method
Throw N darts at a unit square.
(Randomly sample [0,1] × [0,1].)
Count a “throw” as a “hit” if the sample lies within
the quarter unit circle.
Area of the quarter unit circle = π/4, so
π/4 ≈ hits/throws
π ≈ 4 × (hits/throws)
Rejection Sampling Python Code
from random import random
from math import sqrt

darts = 10000000
hits = 0
throws = 0
for _ in range(darts):
    throws += 1
    x = random()
    y = random()
    dist = sqrt(x * x + y * y)
    if dist <= 1.0:
        hits += 1
pi = 4 * hits / throws  # true division; Python 2's hits/throws truncated to 0
print("pi = %s" % pi)
Rejection Sampling Code Results
Estimates
3.1419759142
3.1419711413
3.1420311142
Avg. Estimate
3.1417680475
Real Value:
3.1415926535
The Ising Model (Magnetization)
Goal: Minimize the energy
(i.e. the Hamiltonian)
Note: the energy is lower when
spins are aligned.
We will use the mean field (nearest
neighbor) approximation
Ising MCMC Implementation
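The implementation slides were image-only in the original deck; as a stand-in, here is a minimal sketch of a 2D Ising Metropolis sweep (lattice size, temperatures, and sweep count are illustrative choices, not the deck's):

```python
import math
import random

def ising_mcmc(L=16, beta=0.6, sweeps=200, seed=0):
    # L x L lattice of spins +/-1 with periodic boundaries;
    # H = -J * sum over nearest-neighbor pairs of s_i * s_j, with J = 1.
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]  # cold start: all spins up
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum of the four nearest-neighbor spins.
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nn  # energy change if we flip (i, j)
            # Metropolis rule: always accept downhill moves, accept
            # uphill moves with probability exp(-beta * dE).
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] = -spins[i][j]
    m = sum(sum(row) for row in spins) / (L * L)
    return abs(m)  # absolute magnetization per spin

# Below the critical point (beta_c ~ 0.44) the lattice stays magnetized;
# well above it (small beta, i.e. high temperature) it disorders.
print(ising_mcmc(beta=0.6), ising_mcmc(beta=0.2))
```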
Results
FYI: Results taken from website:
http://farside.ph.utexas.edu/teaching/329/lectures/nod
Given sufficient testing, this is what we'd find.
