Adaptive Methods for Cubature
Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
hickernell@iit.edu mypages.iit.edu/~hickernell
Thanks to Lan Jiang, Tony Jiménez Rugama, Jagadees Rathinavel,
and the rest of the Guaranteed Automatic Integration Library (GAIL) team
Supported by NSF-DMS-1522687
For more details see H. (2017+), H. et al. (2017+), and Choi et al. (2013–2015)
Outline: Problem · IID Monte Carlo · Low Discrepancy Sampling · Bayesian Cubature · Numerical Examples · References
Problem

$$\text{answer}(\mu) = {?}, \qquad \mu = \int_{\mathbb{R}^d} f(\boldsymbol{x})\,\nu(d\boldsymbol{x}) \in \mathbb{R}^p, \qquad \hat{\mu}_n = \sum_{i=1}^n w_i f(\boldsymbol{x}_i)$$

Given $\varepsilon_a$, $\varepsilon_r$, choose $n$ and $\widehat{\text{answer}}$, depending on $\hat{\mu}_n$, to guarantee

$$|\text{answer}(\mu) - \widehat{\text{answer}}| \le \max(\varepsilon_a, \varepsilon_r\,|\text{answer}(\mu)|) \quad \text{(with high probability)}$$

adaptively and automatically.
E.g.,

$$\text{option price} = \int_{\mathbb{R}^d} \text{payoff}(\boldsymbol{x})\, \frac{e^{-\boldsymbol{x}^T \Sigma^{-1} \boldsymbol{x}/2}}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\, d\boldsymbol{x}, \qquad \Sigma = \bigl(\min(i,j)\,T/d\bigr)_{i,j=1}^{d}$$

$$\text{Gaussian probability} = \int_{[\boldsymbol{a},\boldsymbol{b}]} \frac{e^{-\boldsymbol{x}^T \Sigma^{-1} \boldsymbol{x}/2}}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\, d\boldsymbol{x} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x})\, d\boldsymbol{x}$$

$$\text{Sobol' index}_j = \frac{\int_{[0,1]^{2d}} \text{output}(\boldsymbol{x})\,\bigl[\text{output}(\boldsymbol{x}_j : \boldsymbol{x}'_{-j}) - \text{output}(\boldsymbol{x}')\bigr]\, d\boldsymbol{x}\, d\boldsymbol{x}'}{\int_{[0,1]^d} \text{output}(\boldsymbol{x})^2\, d\boldsymbol{x} - \Bigl(\int_{[0,1]^d} \text{output}(\boldsymbol{x})\, d\boldsymbol{x}\Bigr)^2}$$

$$\text{Bayesian estimate}_j = \frac{\int_{\mathbb{R}^d} \beta_j\, \text{prob}(\text{data}\,|\,\boldsymbol{\beta})\, \text{prob}_{\text{prior}}(\boldsymbol{\beta})\, d\boldsymbol{\beta}}{\int_{\mathbb{R}^d} \text{prob}(\text{data}\,|\,\boldsymbol{\beta})\, \text{prob}_{\text{prior}}(\boldsymbol{\beta})\, d\boldsymbol{\beta}}$$
If $\mu \in [\hat{\mu}_n - \text{err}_n, \hat{\mu}_n + \text{err}_n]$ (with high probability), then the optimal and successful choice is

$$\widehat{\text{answer}} = \frac{\text{ans}_- \max(\varepsilon_a, \varepsilon_r\,|\text{ans}_+|) + \text{ans}_+ \max(\varepsilon_a, \varepsilon_r\,|\text{ans}_-|)}{\max(\varepsilon_a, \varepsilon_r\,|\text{ans}_+|) + \max(\varepsilon_a, \varepsilon_r\,|\text{ans}_-|)}$$

where $\text{ans}_{\pm} := \left\{\begin{matrix}\sup\\ \inf\end{matrix}\right\}_{\mu \in [\hat{\mu}_n - \text{err}_n,\, \hat{\mu}_n + \text{err}_n]} \text{answer}(\mu)$, provided

$$\frac{|\text{ans}_+ - \text{ans}_-|}{\max(\varepsilon_a, \varepsilon_r\,|\text{ans}_+|) + \max(\varepsilon_a, \varepsilon_r\,|\text{ans}_-|)} \le 1 \quad \text{(H. et al., 2017+)}$$
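This combination rule can be sketched in a few lines. A minimal sketch, assuming the error interval has already produced the extreme answers; the names `tol` and `answer_hat` are mine, not from the talk or GAIL:

```python
# Sketch of the interval-to-answer combination rule, given the extreme
# answers ans_minus (inf) and ans_plus (sup) over the error interval.

def tol(ans, eps_a, eps_r):
    """Error tolerance max(eps_a, eps_r * |ans|)."""
    return max(eps_a, eps_r * abs(ans))

def answer_hat(ans_minus, ans_plus, eps_a, eps_r):
    """Return the optimal estimate, or None when the success condition
    |ans_plus - ans_minus| <= tol(ans_plus) + tol(ans_minus) fails."""
    t_minus = tol(ans_minus, eps_a, eps_r)
    t_plus = tol(ans_plus, eps_a, eps_r)
    if abs(ans_plus - ans_minus) > t_plus + t_minus:
        return None  # interval still too wide: take more samples
    return (ans_minus * t_plus + ans_plus * t_minus) / (t_plus + t_minus)
```

With $\varepsilon_r = 0$ the rule reduces to the midpoint of $[\text{ans}_-, \text{ans}_+]$.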
Berry–Esseen Stopping Rule for IID Monte Carlo

$$\mu = \int_{\mathbb{R}^d} f(\boldsymbol{x})\,\nu(d\boldsymbol{x}), \qquad \hat{\mu}_n = \frac{1}{n}\sum_{i=1}^n f(\boldsymbol{x}_i), \qquad \boldsymbol{x}_i \overset{\text{IID}}{\sim} \nu$$

Need $\mu \in [\hat{\mu}_n - \text{err}_n, \hat{\mu}_n + \text{err}_n]$ with high probability:

$$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] \approx 99\% \quad \text{for } \Phi\bigl(-\sqrt{n}\,\text{err}_n/(1.2\,\hat{\sigma})\bigr) = 0.005$$

by the Central Limit Theorem, where $\hat{\sigma}^2$ is the sample variance.
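A minimal sketch of the CLT version of this stopping rule, for an absolute tolerance only; it redraws the whole sample at each stage for simplicity, unlike the guaranteed algorithm in GAIL:

```python
import math, random

def iid_mc_clt(f, sample, eps_a, n0=1000, z=2.576):
    """CLT-based stopping sketch: grow n until the approximate 99%
    half-width err_n = z * 1.2 * sigma_hat / sqrt(n) drops below eps_a.
    z = 2.576 is the 99.5% normal quantile, so Phi(-z) is about 0.005;
    the 1.2 inflation of sigma_hat follows the slide.  The full sample is
    redrawn at each stage, which is wasteful but keeps the sketch short."""
    n = n0
    while True:
        ys = [f(sample()) for _ in range(n)]
        mu_hat = sum(ys) / n
        sigma_hat = math.sqrt(sum((y - mu_hat) ** 2 for y in ys) / (n - 1))
        err_n = z * 1.2 * sigma_hat / math.sqrt(n)
        if err_n <= eps_a:
            return mu_hat, err_n
        n *= 2  # double the sample size and try again

# e.g. iid_mc_clt(lambda x: x * x, random.random, 5e-3) estimates
# the integral of x^2 over [0, 1].
```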
Refining with the Berry–Esseen inequality,

$$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] \ge 99\% \quad \text{for } \Phi\bigl(-\sqrt{n}\,\text{err}_n/(1.2\,\hat{\sigma}_{n_\sigma})\bigr) + \Delta_n\bigl(-\sqrt{n}\,\text{err}_n/(1.2\,\hat{\sigma}_{n_\sigma}),\, \kappa_{\max}\bigr) = 0.0025,$$

where $\hat{\sigma}_{n_\sigma}^2$ is the sample variance computed from an independent sample of size $n_\sigma$, provided that $\text{kurt}(f(\boldsymbol{X})) \le \kappa_{\max}(n_\sigma)$ (H. et al., 2013; Jiang, 2016).
Adaptive Low Discrepancy Sampling Cubature

$$\mu = \int_{[0,1]^d} f(\boldsymbol{x})\, d\boldsymbol{x}, \qquad \hat{\mu}_n = \frac{1}{n}\sum_{i=1}^n f(\boldsymbol{x}_i), \qquad \boldsymbol{x}_i \text{ Sobol' or lattice}$$

Need $\mu \in [\hat{\mu}_n - \text{err}_n, \hat{\mu}_n + \text{err}_n]$.
Express $\mu - \hat{\mu}_n$ in terms of the Fourier coefficients of $f$. Assuming that these coefficients do not decay erratically, the discrete transform $\{\tilde{f}_{n,\kappa}\}_{\kappa=0}^{n-1}$ may be used to bound the error reliably (H. and Jiménez Rugama, 2016; Jiménez Rugama and H., 2016; H. et al., 2017+):

$$|\mu - \hat{\mu}_n| \le \text{err}_n := C(n, \ell) \sum_{\kappa = 2^{\ell-1}}^{2^\ell - 1} |\tilde{f}_{n,\kappa}|$$
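The lattice half of this cubature can be sketched with a randomly shifted rank-1 lattice. This is a simplified stand-in under stated assumptions: the generating vector is a placeholder (not one from the cited papers), and stopping when successive estimates agree replaces the guaranteed Fourier-coefficient bound $\text{err}_n$:

```python
import random

def lattice_cubature(f, d, gen, eps_a, m0=6, m_max=20, seed=11):
    """Shifted rank-1 lattice cubature sketch: nodes x_i = frac(i*z/n + shift)
    with n = 2^m, doubling m.  Stopping when consecutive estimates agree
    within eps_a/2 is a crude heuristic stand-in for the guaranteed
    coefficient-based bound err_n of the cited papers."""
    random.seed(seed)
    shift = [random.random() for _ in range(d)]  # random shift of the lattice
    prev = None
    for m in range(m0, m_max + 1):
        n = 2 ** m
        est = sum(
            f([(i * gen[j] / n + shift[j]) % 1.0 for j in range(d)])
            for i in range(n)
        ) / n
        if prev is not None and abs(est - prev) <= eps_a / 2:
            return est
        prev = est
    return prev  # budget exhausted; return the last estimate

# e.g. lattice_cubature(lambda x: x[0] + x[1], 2, [1, 21], 1e-3)
# approximates the integral of x1 + x2 over [0, 1]^2, which is 1.
```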
Bayesian Cubature—$f$ Is Random

$$\mu = \int_{\mathbb{R}^d} f(\boldsymbol{x})\,\nu(d\boldsymbol{x}), \qquad \hat{\mu}_n = \sum_{i=1}^n w_i\, f(\boldsymbol{x}_i)$$

Need $\mu \in [\hat{\mu}_n - \text{err}_n, \hat{\mu}_n + \text{err}_n]$ with high probability.

Assume $f \sim \mathcal{GP}(0, C)$. Choose the $w_i$ to integrate the best estimate of $f$ given the data $\{\boldsymbol{x}_i, f(\boldsymbol{x}_i)\}_{i=1}^n$ (Diaconis, 1988; O'Hagan, 1991; Ritter, 2000; Rasmussen and Ghahramani, 2003):

$$\mathbb{P}[|\mu - \hat{\mu}_n| \le \text{err}_n] = 99\% \quad \text{for } \text{err}_n = \text{an expression involving } C \text{ and } \{\boldsymbol{x}_i, f(\boldsymbol{x}_i)\}_{i=1}^n$$

A de-randomized interpretation exists (H., 2017+).
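A toy one-dimensional instance of this idea, assuming the Brownian motion covariance $C(x,t) = \min(x,t)$ (my choice; the talk does not fix a kernel), for which $\int_0^1 C(x, x_i)\,dx = x_i - x_i^2/2$ and $\int_0^1\!\int_0^1 C\,dx\,dt = 1/3$ are closed form. Then $\hat{\mu}_n = \boldsymbol{c}^T C^{-1} \boldsymbol{y}$ and $\text{err}_n$ is a posterior-standard-deviation half-width:

```python
import math

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting (keeps the sketch
    dependency-free; use numpy.linalg.solve in practice)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            m = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= m * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def bayesian_cubature(f, nodes, z=2.576):
    """Bayesian cubature sketch on [0,1] under the assumed Brownian motion
    covariance C(x,t) = min(x,t): mu_hat = c^T C^{-1} y, and err_n is the
    z-sigma posterior half-width (roughly 99% for z = 2.576)."""
    y = [f(t) for t in nodes]
    C = [[min(s, t) for t in nodes] for s in nodes]   # Gram matrix
    c = [t - t * t / 2 for t in nodes]                # kernel integrals
    a = solve(C, y)                                   # C^{-1} y
    b = solve(C, c)                                   # C^{-1} c
    mu_hat = sum(ci * ai for ci, ai in zip(c, a))
    var = 1.0 / 3.0 - sum(ci * bi for ci, bi in zip(c, b))  # posterior variance
    return mu_hat, z * math.sqrt(max(var, 0.0))
```

For $f(t) = t$ with nodes $\{0.25, 0.5, 0.75\}$ the posterior mean interpolates $f$ piecewise linearly (constant beyond the last node), so the estimate of $\int_0^1 t\,dt = 0.5$ is $0.46875$, well inside the reported interval.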
Gaussian Probability

$$\mu = \int_{[\boldsymbol{a},\boldsymbol{b}]} \frac{\exp\bigl(-\tfrac{1}{2}\boldsymbol{t}^T \Sigma^{-1} \boldsymbol{t}\bigr)}{\sqrt{(2\pi)^d \det(\Sigma)}}\, d\boldsymbol{t} \overset{\text{Genz (1993)}}{=} \int_{[0,1]^{d-1}} f(\boldsymbol{x})\, d\boldsymbol{x}$$

For some typical choice of $\boldsymbol{a}$, $\boldsymbol{b}$, $\Sigma$, with $d = 3$, $\varepsilon_a = 0$; $\mu \approx 0.6763$:

    ε_r     Method             % Accuracy   Worst 10% n   Worst 10% Time (s)
    1E−2    IID Monte Carlo    100%         8.1E4         1.8E−2
            Sobol' Sampling    100%         1.0E3         5.1E−3
            Bayesian Lattice   100%         1.0E3         2.8E−3
    1E−3    IID Monte Carlo    100%         2.0E6         3.8E−1
            Sobol' Sampling    100%         2.0E3         7.7E−3
            Bayesian Lattice   100%         1.0E3         2.8E−3
    1E−4    Sobol' Sampling    100%         1.6E4         1.8E−2
            Bayesian Lattice   100%         8.2E3         1.4E−2
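As a sanity check on such probabilities, plain IID Monte Carlo (not the Genz transformation used for the table) can be compared against the closed-form bivariate orthant probability $P(X_1 \le 0, X_2 \le 0) = \tfrac{1}{4} + \arcsin(\rho)/(2\pi)$:

```python
import math, random

def orthant_prob_mc(rho, n, seed=3):
    """IID Monte Carlo estimate of P(X1 <= 0, X2 <= 0) for a bivariate
    normal with correlation rho, via the Cholesky-style construction
    X2 = rho*Z1 + sqrt(1 - rho^2)*Z2 with Z1, Z2 independent N(0,1)."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        x2 = rho * z1 + math.sqrt(1 - rho * rho) * z2
        if z1 <= 0 and x2 <= 0:
            hits += 1
    return hits / n

def orthant_prob_exact(rho):
    """Closed-form bivariate orthant probability for comparison."""
    return 0.25 + math.asin(rho) / (2 * math.pi)
```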
Sobol' Indices

$Y = \text{output}(\boldsymbol{X})$, where $\boldsymbol{X} \sim \mathcal{U}[0,1]^d$; $\text{Sobol' Index}_j(\mu)$ describes how much coordinate $j$ of input $\boldsymbol{X}$ influences output $Y$ (Sobol', 1990; 2001):

$$\text{Sobol' Index}_j(\mu) := \frac{\mu_1}{\mu_2 - \mu_3^2}, \qquad j = 1, \ldots, d$$

$$\mu_1 := \int_{[0,1)^{2d}} \text{output}(\boldsymbol{x})\,\bigl[\text{output}(\boldsymbol{x}_j : \boldsymbol{x}'_{-j}) - \text{output}(\boldsymbol{x}')\bigr]\, d\boldsymbol{x}\, d\boldsymbol{x}'$$

$$\mu_2 := \int_{[0,1)^d} \text{output}(\boldsymbol{x})^2\, d\boldsymbol{x}, \qquad \mu_3 := \int_{[0,1)^d} \text{output}(\boldsymbol{x})\, d\boldsymbol{x}$$

$\text{output}(\boldsymbol{x}) = -x_1 + x_1 x_2 - x_1 x_2 x_3 + \cdots + x_1 x_2 x_3 x_4 x_5 x_6$ (Bratley et al., 1992)

$\varepsilon_a = 1\text{E}{-3}$, $\varepsilon_r = 0$:

    j                                      1        2        3        4        5        6
    n                                      65 536   32 768   16 384   16 384   2 048    2 048
    Sobol' Index_j (true)                  0.6529   0.1791   0.0370   0.0133   0.0015   0.0015
    Sobol' Index_j (adaptive estimate)     0.6528   0.1792   0.0363   0.0126   0.0010   0.0012
    Sobol' Index_j(µ̂_n) (plug-in)          0.6492   0.1758   0.0308   0.0083   0.0018   0.0039
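The table's indices can be reproduced roughly with plain IID Monte Carlo in place of the talk's adaptive Sobol' sampling; `bratley` implements the test function above, and `sobol_index_mc` estimates $\mu_1$, $\mu_2$, $\mu_3$ from a common sample (the coordinate index `j` is zero-based here):

```python
import random

def bratley(x):
    """Test function of Bratley et al. (1992): sum_i (-1)^i prod_{j<=i} x_j."""
    total, prod = 0.0, 1.0
    for i, xi in enumerate(x, start=1):
        prod *= xi
        total += (-1) ** i * prod
    return total

def sobol_index_mc(f, d, j, n, seed=5):
    """IID Monte Carlo sketch of the first-order Sobol' index, using the
    correlation-type identity mu1 = E[ f(x) (f(x_j : x'_{-j}) - f(x')) ],
    where (x_j : x'_{-j}) takes coordinate j from x and the rest from x'."""
    random.seed(seed)
    s1 = s2 = s3 = 0.0
    for _ in range(n):
        x = [random.random() for _ in range(d)]
        xp = [random.random() for _ in range(d)]
        mixed = xp[:]
        mixed[j] = x[j]                   # x_j : x'_{-j}
        fx = f(x)
        s1 += fx * (f(mixed) - f(xp))     # accumulates mu1
        s2 += fx * fx                     # accumulates mu2
        s3 += fx                          # accumulates mu3
    mu1, mu2, mu3 = s1 / n, s2 / n, s3 / n
    return mu1 / (mu2 - mu3 * mu3)
```

With $n = 2^{16}$ points this lands near the tabulated value $0.6529$ for $j = 1$, though without the guaranteed error bound of the adaptive algorithms.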
Thank you
Slides available at
www.slideshare.net/fjhickernell/siam-cse-2017-talk
References I
Bratley, P., B. L. Fox, and H. Niederreiter. 1992. Implementation and tests of low-discrepancy
sequences, ACM Trans. Model. Comput. Simul. 2, 195–213.
Choi, S.-C. T., Y. Ding, F. J. H., L. Jiang, Ll. A. Jiménez Rugama, X. Tong, Y. Zhang, and X. Zhou.
2013–2015. GAIL: Guaranteed Automatic Integration Library (versions 1.0–2.1).
Cools, R. and D. Nuyens (eds.) 2016. Monte Carlo and quasi-Monte Carlo methods: MCQMC,
Leuven, Belgium, April 2014, Springer Proceedings in Mathematics and Statistics, vol. 163,
Springer-Verlag, Berlin.
Diaconis, P. 1988. Bayesian numerical analysis, Statistical decision theory and related topics IV,
Papers from the 4th Purdue symp., West Lafayette, Indiana 1986, pp. 163–175.
Genz, A. 1993. Comparison of methods for the computation of multivariate normal probabilities,
Computing Science and Statistics 25, 400–405.
H., F. J. 2017+. Error analysis of quasi-Monte Carlo methods. submitted for publication,
arXiv:1702.01487.
H., F. J., L. Jiang, Y. Liu, and A. B. Owen. 2013. Guaranteed conservative fixed width confidence
intervals via Monte Carlo sampling, Monte Carlo and quasi-Monte Carlo methods 2012, pp. 105–128.
H., F. J. and Ll. A. Jiménez Rugama. 2016. Reliable adaptive cubature using digital sequences,
Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 367–383.
arXiv:1410.8615 [math.NA].
References II
H., F. J., Ll. A. Jiménez Rugama, and D. Li. 2017+. Adaptive quasi-Monte Carlo methods. submitted
for publication, arXiv:1702.01491 [math.NA].
Jiang, L. 2016. Guaranteed adaptive Monte Carlo methods for estimating means of random
variables, Ph.D. Thesis.
Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1
lattices, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014,
pp. 407–422. arXiv:1411.1966.
O’Hagan, A. 1991. Bayes-Hermite quadrature, J. Statist. Plann. Inference 29, 245–260.
Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information
Processing Systems, pp. 489–496.
Ritter, K. 2000. Average-case analysis of numerical problems, Lecture Notes in Mathematics,
vol. 1733, Springer-Verlag, Berlin.
Sobol’, I. M. 1990. On sensitivity estimation for nonlinear mathematical models, Matem. Mod. 2,
no. 1, 112–118.
. 2001. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo
estimates, Math. Comput. Simul. 55, no. 1-3, 271–280.

  • 15. Problem IID Monte Carlo Low Discrepancy Sampling Bayesian Cubature Numerical Examples References References II H., F. J., Ll. A. Jiménez Rugama, and D. Li. 2017+. Adaptive quasi-Monte Carlo methods. submitted for publication, arXiv:1702.01491 [math.NA]. Jiang, L. 2016. Guaranteed adaptive Monte Carlo methods for estimating means of random variables, Ph.D. Thesis. Jiménez Rugama, Ll. A. and F. J. H. 2016. Adaptive multidimensional integration based on rank-1 lattices, Monte Carlo and quasi-Monte Carlo methods: MCQMC, Leuven, Belgium, April 2014, pp. 407–422. arXiv:1411.1966. O’Hagan, A. 1991. Bayes-Hermite quadrature, J. Statist. Plann. Inference 29, 245–260. Rasmussen, C. E. and Z. Ghahramani. 2003. Bayesian Monte Carlo, Advances in Neural Information Processing Systems, pp. 489–496. Ritter, K. 2000. Average-case analysis of numerical problems, Lecture Notes in Mathematics, vol. 1733, Springer-Verlag, Berlin. Sobol’, I. M. 1990. On sensitivity estimation for nonlinear mathematical models, Matem. Mod. 2, no. 1, 112–118. . 2001. Global sensitivity indices for nonlinear mathematical models and their monte carlo estimates, Math. Comput. Simul. 55, no. 1-3, 271–280. 10/10