Strong convergence of probabilistic solvers
for deterministic ordinary differential equations
(joint work with A. M. Stuart, T. J. Sullivan)
Han Cheng Lie
SAMSI QMC Transition Workshop
Research Triangle Park, NC
08.05.2018
Support by Excellence Initiative of German Research Foundation (DFG),
National Science Foundation grant DMS-1127914, SAMSI QMC
Introduction
Applied mathematics:
Computable approximations for continuous problems
Replacing continuous phenomena by finite sets of values
The QMC Program: Placing points in space
Strategies: random (Monte Carlo), deterministic (quasi-Monte Carlo),
random-deterministic
Classical numerical analysis for deterministic ODEs
Points belong to infinite-dimensional space of trajectories
Quantity of interest: Dirac measure supported at true trajectory
Construct trajectories by interpolating between point evaluations
Main tasks: 1) recursively construct sequences of point evaluations;
2) quantify the error in trajectory placement
Introduction
This project:
Use Monte Carlo to determine point evaluations
Approximate Dirac measure with nondegenerate probability measure
Establish contraction to Dirac measure at true trajectory
Further development of convergence analysis of Conrad et al.,
“Statistical analysis of differential equations: introducing probability
measures on numerical solutions”, Stat. Comput. (2017)
Solving ordinary differential equations
Initial value problem (IVP) for 0 ≤ t ≤ T with solution u = (u(t))_{0≤t≤T}:
(d/dt) u(t) = f(u(t)), u(0) = u_0 ∈ R^d
Flow map semigroup (F^h)_{h≥0}, with F^h : R^d → R^d satisfying
F^h(u_0) = u(h), F^0(u_0) = u_0
Sequence of observations: u_k = (F^h)^k(u_0) = u(kh)
Represent a deterministic solver with time step h > 0 by Ψ^h : R^d → R^d:
[Ψ^h]^k(u_0) ≈ (F^h)^k(u_0) = u_k, for u_0 ∈ R^d, k ∈ N_0
Common task in classical numerical analysis: prove
sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)| ≤ C h^{q+1}
⇒ Asymptotically, the worst-case error over trajectories decreases with h at rate q
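As a concrete, hypothetical illustration of the local-order bound, the sketch below uses explicit Euler for Ψ^h (local order q + 1 with q = 1) on the scalar test problem du/dt = −u, whose flow map F^h(v) = e^{−h} v is known in closed form; all names and the test problem are my illustrative choices, not from the talk.

```python
import numpy as np

def Psi_h(f, v, h):
    # Explicit Euler step: a deterministic solver of local order q + 1 with q = 1
    return v + h * f(v)

# Illustrative test problem: du/dt = -u, with exact flow map F^h(v) = exp(-h) * v
f = lambda u: -u
F_h = lambda v, h: np.exp(-h) * v

# One-step errors |Psi^h(v) - F^h(v)| at v = 1 for successively halved h
hs = [0.1, 0.05, 0.025]
errs = [abs(Psi_h(f, 1.0, h) - F_h(1.0, h)) for h in hs]
# Halving h should roughly quarter the error, consistent with O(h^{q+1}) = O(h^2)
```

Each halving of h shrinks the one-step error by a factor close to 2^{q+1} = 4, as the bound predicts.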
Motivation for probabilistic approach
Available information about solution values off the time grid (kh)_k comes
from the on-grid values ([Ψ^h]^k(u_0))_k
⇒ Need for uncertainty quantification (UQ) in off-grid values
In some cases, more solves on coarse grids yield more statistical
information than fewer solves on fine grids
→ inverse problems, data assimilation, multilevel Monte Carlo
In other cases, the quantity of interest (QoI) is a functional of the solution of the IVP
→ Can use estimates of the off-grid uncertainty due to Ψ^h to estimate
uncertainty in the QoI
Probabilistic solvers of Conrad et al. (2017)
Fix a probability space (Ω, P) and a time step h > 0
Using the deterministic solver Ψ^h, generate (U_k)_k with U_0 = u_0 by
U_{k+1} := Ψ^h(U_k) + ξ_k(h), ξ_k(h) = ∫_0^h χ_k(s) ds
♠ Stochastic process χ_k on [0, h] models off-grid behaviour
(Use Monte Carlo to determine point evaluations)
Solution sequences: (u(kh))_k (true solution)
([Ψ^h]^k(u_0))_k (deterministic approximation)
(U_k)_k (probabilistic approximation)
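A minimal sketch of one sample path of this construction, assuming explicit Euler for Ψ^h and i.i.d. Gaussian ξ_k(h) with standard deviation σ h^{p+1/2}; the function name, parameters, and the test problem du/dt = −u are illustrative choices of mine, not from the paper.

```python
import numpy as np

def probabilistic_euler(f, u0, h, n_steps, p=1.0, sigma=1.0, rng=None):
    """One sample path (U_k)_k of U_{k+1} = Psi^h(U_k) + xi_k(h),
    with Psi^h = explicit Euler and xi_k(h) ~ N(0, sigma^2 h^{2p+1})."""
    rng = rng or np.random.default_rng()
    U = [u0]
    for _ in range(n_steps):
        xi = sigma * h ** (p + 0.5) * rng.standard_normal()
        U.append(U[-1] + h * f(U[-1]) + xi)
    return np.array(U)

# Illustrative run: du/dt = -u on [0, 1] with h = 0.01 and p = q = 1
rng = np.random.default_rng(0)
paths = [probabilistic_euler(lambda u: -u, 1.0, 0.01, 100, rng=rng) for _ in range(50)]
mean_final = np.mean([path[-1] for path in paths])  # close to exp(-1)
```

Averaging the final values over the sample paths recovers the true terminal value u(1) = e^{−1} up to the deterministic discretisation error plus small Monte Carlo noise.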
Two challenges, and their relevance
1) Obtain probabilistic analogues of worst-case bounds over
trajectories, i.e. of
max_k |u_k − (Ψ^h)^k(u_0)| ≤ C h^q
♠ Sometimes a worst-case bound is less useful than a “most likely”-case
bound (e.g. weather prediction)
2) Describe conditions on (ξ_k)_k so that P ∘ ((U_k)_k)^{−1} provides a good
approximation of the Dirac measure at the true solution
♠ Improving random approximations may be cheaper and more
effective than improving the deterministic approximation
Strong convergence result of Conrad et al.
Theorem
Assume: mean-zero, i.i.d. Gaussian noise (ξ_k(h))_k with
E[|ξ_k(h)|²] ≤ C_ξ (h²)^{p+1/2}, p ≥ 1,
a globally Lipschitz vector field f, and that Ψ^h has local order q + 1, i.e.
sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)| ≤ C_Ψ h^{q+1}.
Then for some C > 0,
max_k E[|u_k − U_k|²] ≤ C h^{2 min{q,p}}.
Compare with the deterministic solver:
max_k |u_k − (Ψ^h)^k(u_0)|² ≤ C h^{2q}
♠ q = p ⇒ the probabilistic solver has the same convergence rate
⇒ maximum amount of solution uncertainty consistent with the order of Ψ^h
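The mean-square rate can be checked empirically. The sketch below is an illustrative Monte Carlo experiment of mine (problem, parameters, and function names are not from the paper): it estimates max_k E|u_k − U_k|² for du/dt = −u with explicit Euler and q = p = 1, where halving h should roughly quarter the error.

```python
import numpy as np

def max_mse(h, T=1.0, n_paths=400, p=1.0, sigma=1.0, seed=2):
    """Monte Carlo estimate of max_k E|u_k - U_k|^2 for du/dt = -u, u(0) = 1,
    with explicit Euler (q = 1) plus i.i.d. N(0, sigma^2 h^{2p+1}) noise."""
    rng = np.random.default_rng(seed)
    n = int(round(T / h))
    u_true = np.exp(-h * np.arange(n + 1))  # exact on-grid solution u(kh)
    U = np.full(n_paths, 1.0)               # n_paths sample paths, stepped jointly
    worst = 0.0
    for k in range(n):
        xi = sigma * h ** (p + 0.5) * rng.standard_normal(n_paths)
        U = U - h * U + xi
        worst = max(worst, np.mean((u_true[k + 1] - U) ** 2))
    return worst

e_coarse, e_fine = max_mse(0.02), max_mse(0.01)
# Expected: e_coarse / e_fine is roughly 2^{2 min{q,p}} = 4
```

The observed ratio sits near 4, matching the h^{2 min{q,p}} = h² rate of the theorem up to Monte Carlo error.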
Recent results
Theorem (Stuart, Sullivan)
Assume: mean-zero, mutually independent noise (ξ_k(h))_k such that
E[|ξ_k(h)|²] ≤ C_ξ (h²)^{p+1/2}, p ≥ 1,
a globally Lipschitz flow map F^h, and that the numerical solver Ψ^h has
local order q + 1 and satisfies
X ∈ L²_P ⇒ Ψ^h(X) ∈ L²_P.
Then for some C > 0,
E[max_k |u_k − U_k|²] ≤ C h^{2 min{q,p}}.
♠ No Gaussianity assumption
♠ Weaker Lipschitz condition (on the flow map instead of the vector field)
♠ Stronger convergence, since the supremum is inside the expectation
Proof strategy: use the moment condition on Ψ^h, the zero mean, and
independence to construct a martingale
Apply a martingale inequality, the Lipschitz property of F^h, and the uniform
error bound on Ψ^h to bound
E_j := max_{k ≤ j} |u_k − U_k|²
Take expectations, apply Gronwall's lemma to the sequence (E[E_j])_j
Questions:
- Can we remove the mean-zero and independence assumptions?
- Can we consider noise with fewer moments (heavier tails)?
Recent results
Uniform moment bound assumption: for some p, C_ξ ≥ 1, R ∈ N,
E[|ξ_k(h)|^r] ≤ (C_ξ h^{p+1/2})^r, 1 ≤ r ≤ R
♠ No mean-zero or i.i.d. assumptions; fewer moments if R = 1
Theorem (L.)
Assume that the uniform moment bound holds, that Ψ^h has order
q + 1, and that F^h is globally Lipschitz. Then for some C > 0,
E[max_k |u_k − U_k|^R] ≤ C (h^R)^{min{q, p−1/2}}
♠ No assumption that X ∈ L²_P ⇒ Ψ^h(X) ∈ L²_P
♣ p = q + 1/2 ⇒ maximum uncertainty consistent with the order of Ψ^h
(tradeoff: zero mean + independence vs. faster moment decay)
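The uniform moment bound is easy to satisfy without Gaussianity, zero mean, or independence. As one illustrative choice of mine (not from the talk): ξ_k(h) = h^{p+1/2} V_k with V_k uniform on [0, 1] is non-centred and non-Gaussian, yet E|ξ_k(h)|^r = h^{(p+1/2)r}/(r + 1) ≤ (C_ξ h^{p+1/2})^r with C_ξ = 1.

```python
import numpy as np

# Empirical check of E|xi_k(h)|^r <= (C_xi h^{p+1/2})^r for 1 <= r <= R,
# with xi_k(h) = h^{p+1/2} V_k and V_k ~ Uniform[0, 1]:
# a noise that is neither mean-zero nor Gaussian.
rng = np.random.default_rng(3)
h, p, C_xi, R = 0.1, 1.0, 1.0, 4
V = rng.uniform(0.0, 1.0, size=200_000)
xi = h ** (p + 0.5) * V
ok = all(np.mean(np.abs(xi) ** r) <= (C_xi * h ** (p + 0.5)) ** r
         for r in range(1, R + 1))
```

Since E[V^r] = 1/(r + 1) < 1, every moment up to order R satisfies the bound, so the theorem applies to this noise family.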
Significance for uncertainty quantification
♠ No independence assumption allows for state-dependent noise ξ_k:
U_{k+1} := Ψ^h(U_k) + ξ_k(h, (U_j)_{j≤k})
Bayesian approach: choose where to place the next evaluation points,
based on which evaluation points have already been chosen
♠ Better use of the regularity of the noise than the result of Conrad et al.
(convergence in the L^R_P topology, for ξ_k(h) ∈ L^R_P)
Consequence: if p ≥ q + 1/2 and R ≥ 3, then
P(max_k |u_k − U_k| ≥ h) ≤ C (h^R)^q < C (h²)^q
⇒ The nondegenerate probability measure P ∘ ((U_k)_k)^{−1} improves the
approximation of the Dirac measure at the true solution sequence (u_k)_k
more rapidly
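Since neither zero mean nor independence is required, the noise amplitude may be driven by the current state. The sketch below is one entirely illustrative construction of mine (not from the talk): the noise scale is tied to a step-doubling local error indicator, so the solver injects more uncertainty exactly where the deterministic step is least trustworthy.

```python
import numpy as np

def step_state_dependent(f, u, h, rng):
    """One step U_{k+1} = Psi^h(U_k) + xi_k(h, U_k), where the noise scale
    is the discrepancy between one full Euler step and two half steps
    (an O(h^2) local error indicator for smooth f)."""
    full = u + h * f(u)
    half = u + 0.5 * h * f(u)
    two_half = half + 0.5 * h * f(half)
    scale = abs(full - two_half)  # state-dependent noise amplitude
    return full + scale * rng.standard_normal()

# Illustrative run on du/dt = -u: the injected noise is tiny because the
# dynamics are tame, so the path stays close to the exact solution
rng = np.random.default_rng(1)
U = 1.0
for _ in range(100):
    U = step_state_dependent(lambda u: -u, U, 0.01, rng)
```

For this problem the indicator is O(h²), so the sampled path tracks u(1) = e^{−1} to within the Euler discretisation error.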
Comparison with classical numerical analysis
Assume: the uniform moment bound condition, for some R ∈ N.
Decay rates of the error incurred by one step:
(deterministic) sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)|^R ≲ h^{R(q+1)}
(probabilistic) E[|ξ_k(h)|^R] ≲ (h^R)^{p+1/2}
Recall: p = q + 1/2 ⇒ maximum uncertainty consistent with the order of Ψ^h
♠ Relative to deterministic solvers, a probabilistic solver with the uniform
moment bound has the ‘same performance’
Cost: constants depend on the additive noise model (ξ_k)_k
Gain: a solver for UQ, more flexible than worst-case analysis
Related results:
Corollary (Exponential integrability): if the uniform moment bound
condition holds for R = +∞, then
E[exp(ρ max_k |u_k − U_k|)] < ∞, for all ρ ∈ R.
♠ Similar techniques yield the same result for max_k |U_k|
Corollary (Convergence in continuous time): if for some p, R, C_ξ ≥ 1,
E[sup_{0≤t≤h} |ξ_k(t)|^R] ≤ (C_ξ h^{p+1/2})^R,
the flow map F^h is globally Lipschitz, and Ψ^h has order q + 1, then
E[sup_{0≤t≤T} |u(t) − U(t)|^R] ≤ C (h^R)^{min{q, p−1/2}}.
Analogous results hold for locally Lipschitz vector fields and flow maps,
and for implicit Euler Ψ^h
Summary
QMC Program: placing points in space
This project: use Monte Carlo to place points in ∞-dimensional space
Motivation: uncertainty quantification in off-grid values
We provide theoretical support for probabilistic integrators, by
Showing that they can have the same performance as deterministic
solvers (modulo constants)
Leveraging more information in the noise ⇒ better approximations,
obtained faster
Removing the mean-zero and independence assumptions ⇒ many
more possibilities for the placement of evaluation points
Acknowledgements
Ilse Ipsen for supporting this programme
SAMSI staff for efficient administration: Thomas Gehrmann, Rebecca
Gunn, Sue McDonald, Karem Jackson, and others
Chris Oates, Tim Sullivan, and members of WG Probabilistic
Numerics
Some literature on probabilistic methods for evolution equations
[Caveat: Not a complete list]
Abdulle and Garegnani, “Random time step probabilistic methods for uncertainty quantification in chaotic and
geometric numerical integration”, arXiv:1801.01340
Chkrebtii, Campbell, Calderhead, and Girolami, “Bayesian solution uncertainty quantification for differential
equations”, Bayesian Analysis (2016)
Cockayne, Oates, Sullivan, and Girolami, “Probabilistic numerical methods for PDE-constrained Bayesian inverse
problems”, 36th Int. Workshop on Bayesian Inference and Maximum Entropy Methods, AIP Conf. Proc. (2016)
Conrad, Girolami, Särkkä, Stuart, and Zygalakis, “Statistical analysis of differential equations: introducing probability
measures on numerical solutions”, Stat. Comput. (2017)
Kersting and Hennig, “Active uncertainty calibration in Bayesian ODE solvers”, Proc. Conf. Uncertainty in A.I. (2016)
Lie, Stuart and Sullivan, “Strong convergence rates of probabilistic solvers for ordinary differential equations”,
arXiv:1703.03680
Lie, Sullivan and Teckentrup, “Random forward models and log-likelihoods in Bayesian inverse problems”,
arXiv:1712.05717
Magnani, Kersting, Schober, and Hennig, “Bayesian filtering for ODEs with bounded derivatives”, arXiv:1709.08471
Schober, Särkkä, and Hennig, “A probabilistic model for the numerical solution of initial value problems”, Stat.
Comput. (2017)
Schober, Duvenaud, and Hennig, “Probabilistic ODE solvers with Runge-Kutta means”, NIPS (2014)
Teymur, Zygalakis, and Calderhead, “Probabilistic Linear Multistep Methods”, NIPS (2016)

More Related Content

PDF
Athens workshop on MCMC
PDF
Coordinate sampler: A non-reversible Gibbs-like sampler
PDF
ABC-Xian
PDF
Some Thoughts on Sampling
PDF
Control of Uncertain Nonlinear Systems Using Ellipsoidal Reachability Calculus
PDF
Locality-sensitive hashing for search in metric space
PDF
no U-turn sampler, a discussion of Hoffman & Gelman NUTS algorithm
PDF
Unbiased Hamiltonian Monte Carlo
Athens workshop on MCMC
Coordinate sampler: A non-reversible Gibbs-like sampler
ABC-Xian
Some Thoughts on Sampling
Control of Uncertain Nonlinear Systems Using Ellipsoidal Reachability Calculus
Locality-sensitive hashing for search in metric space
no U-turn sampler, a discussion of Hoffman & Gelman NUTS algorithm
Unbiased Hamiltonian Monte Carlo

What's hot (20)

PDF
Representation formula for traffic flow estimation on a network
PDF
ABC in Roma
PDF
Hamilton-Jacobi equations and Lax-Hopf formulae for traffic flow modeling
PDF
Numerical approach for Hamilton-Jacobi equations on a network: application to...
PDF
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
PDF
Poster for Bayesian Statistics in the Big Data Era conference
PDF
Zap Q-Learning - ISMP 2018
PDF
Sparse-Bayesian Approach to Inverse Problems with Partial Differential Equati...
PDF
Some recent developments in the traffic flow variational formulation
PDF
On Clustering Histograms with k-Means by Using Mixed α-Divergences
PDF
Bayesian adaptive optimal estimation using a sieve prior
PDF
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
PDF
Recursive Compressed Sensing
PDF
The gaussian minimum entropy conjecture
PDF
Numerical approach for Hamilton-Jacobi equations on a network: application to...
PDF
Computer Controlled Systems (solutions manual). Astrom. 3rd edition 1997
PDF
Mark Girolami's Read Paper 2010
PPT
Entropic characteristics of quantum channels and the additivity problem
PDF
The dual geometry of Shannon information
PDF
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
Representation formula for traffic flow estimation on a network
ABC in Roma
Hamilton-Jacobi equations and Lax-Hopf formulae for traffic flow modeling
Numerical approach for Hamilton-Jacobi equations on a network: application to...
14th Athens Colloquium on Algorithms and Complexity (ACAC19)
Poster for Bayesian Statistics in the Big Data Era conference
Zap Q-Learning - ISMP 2018
Sparse-Bayesian Approach to Inverse Problems with Partial Differential Equati...
Some recent developments in the traffic flow variational formulation
On Clustering Histograms with k-Means by Using Mixed α-Divergences
Bayesian adaptive optimal estimation using a sieve prior
Discussion of Fearnhead and Prangle, RSS&lt; Dec. 14, 2011
Recursive Compressed Sensing
The gaussian minimum entropy conjecture
Numerical approach for Hamilton-Jacobi equations on a network: application to...
Computer Controlled Systems (solutions manual). Astrom. 3rd edition 1997
Mark Girolami's Read Paper 2010
Entropic characteristics of quantum channels and the additivity problem
The dual geometry of Shannon information
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
Ad

Similar to QMC: Transition Workshop - Probabilistic Integrators for Deterministic Differential Equations - Han Lie, May 8, 2018 (20)

PDF
Hierarchical matrices for approximating large covariance matries and computin...
PDF
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
PDF
Probabilistic Control of Uncertain Linear Systems Using Stochastic Reachability
PDF
Delayed acceptance for Metropolis-Hastings algorithms
PDF
Distributed solution of stochastic optimal control problem on GPUs
PDF
Trilinear embedding for divergence-form operators
PDF
cswiercz-general-presentation
PDF
Diffusion Schrödinger bridges for score-based generative modeling
PDF
Probabilistic Control of Switched Linear Systems with Chance Constraints
PDF
Diffusion Schrödinger bridges for score-based generative modeling
PDF
Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...
PDF
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
PDF
CS-ChapterI.pdf
PDF
Control of Discrete-Time Piecewise Affine Probabilistic Systems using Reachab...
PDF
Geometric and viscosity solutions for the Cauchy problem of first order
PDF
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
PDF
QMC: Transition Workshop - Density Estimation by Randomized Quasi-Monte Carlo...
PDF
Non-sampling functional approximation of linear and non-linear Bayesian Update
PDF
Unbiased Markov chain Monte Carlo
PDF
Minimum mean square error estimation and approximation of the Bayesian update
Hierarchical matrices for approximating large covariance matries and computin...
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
Probabilistic Control of Uncertain Linear Systems Using Stochastic Reachability
Delayed acceptance for Metropolis-Hastings algorithms
Distributed solution of stochastic optimal control problem on GPUs
Trilinear embedding for divergence-form operators
cswiercz-general-presentation
Diffusion Schrödinger bridges for score-based generative modeling
Probabilistic Control of Switched Linear Systems with Chance Constraints
Diffusion Schrödinger bridges for score-based generative modeling
Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
CS-ChapterI.pdf
Control of Discrete-Time Piecewise Affine Probabilistic Systems using Reachab...
Geometric and viscosity solutions for the Cauchy problem of first order
QMC: Operator Splitting Workshop, A New (More Intuitive?) Interpretation of I...
QMC: Transition Workshop - Density Estimation by Randomized Quasi-Monte Carlo...
Non-sampling functional approximation of linear and non-linear Bayesian Update
Unbiased Markov chain Monte Carlo
Minimum mean square error estimation and approximation of the Bayesian update
Ad

More from The Statistical and Applied Mathematical Sciences Institute (20)

PDF
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
PDF
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
PDF
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
PDF
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
PDF
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
PDF
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
PPTX
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
PDF
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
PDF
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
PPTX
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
PDF
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
PDF
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
PDF
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
PDF
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
PDF
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
PDF
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
PPTX
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
PPTX
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
PDF
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
PDF
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...
Causal Inference Opening Workshop - Latent Variable Models, Causal Inference,...
2019 Fall Series: Special Guest Lecture - 0-1 Phase Transitions in High Dimen...
Causal Inference Opening Workshop - Causal Discovery in Neuroimaging Data - F...
Causal Inference Opening Workshop - Smooth Extensions to BART for Heterogeneo...
Causal Inference Opening Workshop - A Bracketing Relationship between Differe...
Causal Inference Opening Workshop - Testing Weak Nulls in Matched Observation...
Causal Inference Opening Workshop - Difference-in-differences: more than meet...
Causal Inference Opening Workshop - New Statistical Learning Methods for Esti...
Causal Inference Opening Workshop - Bipartite Causal Inference with Interfere...
Causal Inference Opening Workshop - Bridging the Gap Between Causal Literatur...
Causal Inference Opening Workshop - Some Applications of Reinforcement Learni...
Causal Inference Opening Workshop - Bracketing Bounds for Differences-in-Diff...
Causal Inference Opening Workshop - Assisting the Impact of State Polcies: Br...
Causal Inference Opening Workshop - Experimenting in Equilibrium - Stefan Wag...
Causal Inference Opening Workshop - Targeted Learning for Causal Inference Ba...
Causal Inference Opening Workshop - Bayesian Nonparametric Models for Treatme...
2019 Fall Series: Special Guest Lecture - Adversarial Risk Analysis of the Ge...
2019 Fall Series: Professional Development, Writing Academic Papers…What Work...
2019 GDRR: Blockchain Data Analytics - Machine Learning in/for Blockchain: Fu...
2019 GDRR: Blockchain Data Analytics - QuTrack: Model Life Cycle Management f...

Recently uploaded (20)

PDF
3rd Neelam Sanjeevareddy Memorial Lecture.pdf
PPTX
GDM (1) (1).pptx small presentation for students
PPTX
Lesson notes of climatology university.
PDF
FourierSeries-QuestionsWithAnswers(Part-A).pdf
PPTX
PPT- ENG7_QUARTER1_LESSON1_WEEK1. IMAGERY -DESCRIPTIONS pptx.pptx
PPTX
human mycosis Human fungal infections are called human mycosis..pptx
PDF
Microbial disease of the cardiovascular and lymphatic systems
PDF
Insiders guide to clinical Medicine.pdf
PPTX
Pharma ospi slides which help in ospi learning
PDF
102 student loan defaulters named and shamed – Is someone you know on the list?
PPTX
1st Inaugural Professorial Lecture held on 19th February 2020 (Governance and...
PDF
Complications of Minimal Access Surgery at WLH
PDF
Physiotherapy_for_Respiratory_and_Cardiac_Problems WEBBER.pdf
PDF
TR - Agricultural Crops Production NC III.pdf
PPTX
Renaissance Architecture: A Journey from Faith to Humanism
PPTX
IMMUNITY IMMUNITY refers to protection against infection, and the immune syst...
PDF
Black Hat USA 2025 - Micro ICS Summit - ICS/OT Threat Landscape
PPTX
Introduction_to_Human_Anatomy_and_Physiology_for_B.Pharm.pptx
PDF
Abdominal Access Techniques with Prof. Dr. R K Mishra
PDF
01-Introduction-to-Information-Management.pdf
3rd Neelam Sanjeevareddy Memorial Lecture.pdf
GDM (1) (1).pptx small presentation for students
Lesson notes of climatology university.
FourierSeries-QuestionsWithAnswers(Part-A).pdf
PPT- ENG7_QUARTER1_LESSON1_WEEK1. IMAGERY -DESCRIPTIONS pptx.pptx
human mycosis Human fungal infections are called human mycosis..pptx
Microbial disease of the cardiovascular and lymphatic systems
Insiders guide to clinical Medicine.pdf
Pharma ospi slides which help in ospi learning
102 student loan defaulters named and shamed – Is someone you know on the list?
1st Inaugural Professorial Lecture held on 19th February 2020 (Governance and...
Complications of Minimal Access Surgery at WLH
Physiotherapy_for_Respiratory_and_Cardiac_Problems WEBBER.pdf
TR - Agricultural Crops Production NC III.pdf
Renaissance Architecture: A Journey from Faith to Humanism
IMMUNITY IMMUNITY refers to protection against infection, and the immune syst...
Black Hat USA 2025 - Micro ICS Summit - ICS/OT Threat Landscape
Introduction_to_Human_Anatomy_and_Physiology_for_B.Pharm.pptx
Abdominal Access Techniques with Prof. Dr. R K Mishra
01-Introduction-to-Information-Management.pdf

QMC: Transition Workshop - Probabilistic Integrators for Deterministic Differential Equations - Han Lie, May 8, 2018

  • 1. Strong convergence of probabilistic solvers for deterministic ordinary differential equations (joint work with A. M. Stuart, T. J. Sullivan) Han Cheng Lie SAMSI QMC Transition Workshop Research Triangle Park, NC 08.05.2018 Support by Excellence Initiative of German Research Foundation (DFG), National Science Foundation grant DMS-1127914, SAMSI QMC
  • 2. Introduction Applied mathematics: Computable approximations for continuous problems Replacing continuous phenomena by finite sets of values
  • 3. Introduction Applied mathematics: Computable approximations for continuous problems Replacing continuous phenomena by finite sets of values The QMC Program: Placing points in space Strategies: random (Monte Carlo), deterministic (quasi-Monte Carlo), random-deterministic
  • 4. Introduction Applied mathematics: Computable approximations for continuous problems Replacing continuous phenomena by finite sets of values The QMC Program: Placing points in space Strategies: random (Monte Carlo), deterministic (quasi-Monte Carlo), random-deterministic Classical numerical analysis for deterministic ODEs Points belong to infinite-dimensional space of trajectories Quantity of interest: Dirac measure supported at true trajectory
  • 5. Introduction Applied mathematics: Computable approximations for continuous problems Replacing continuous phenomena by finite sets of values The QMC Program: Placing points in space Strategies: random (Monte Carlo), deterministic (quasi-Monte Carlo), random-deterministic Classical numerical analysis for deterministic ODEs Points belong to infinite-dimensional space of trajectories Quantity of interest: Dirac measure supported at true trajectory Construct trajectories by interpolating between point evaluations Main tasks: 1) Recursively construct sequences of point evalutions Main tasks: 2) Quantify error in trajectory placement
  • 6. Introduction This project: Use Monte Carlo to determine point evaluations Approximate Dirac measure with nondegenerate probability measure Establish contraction to Dirac measure at true trajectory Further development of convergence analysis of Conrad et al., “Statistical analysis of differential equations: introducing probability measures on numerical solutions”, Stat. Comput. (2017)
  • 7. Solving ordinary differential equations Initial value problem (IVP) for 0 ≤ t ≤ T with solution u = (u(t))0≤t≤T d dt u(t) = f(u(t)), u(0) = u0 ∈ Rd
  • 8. Solving ordinary differential equations Initial value problem (IVP) for 0 ≤ t ≤ T with solution u = (u(t))0≤t≤T d dt u(t) = f(u(t)), u(0) = u0 ∈ Rd Flow map semigroup (Fh )h≥0, Fh : Rd → Rd satisfies Fh (u0) = u(h), F0 (u0) = u0 sequence of observations uk = (Fh )k (u0) = u(kh)
  • 9. Solving ordinary differential equations Initial value problem (IVP) for 0 ≤ t ≤ T with solution u = (u(t))_{0≤t≤T}: (d/dt) u(t) = f(u(t)), u(0) = u_0 ∈ R^d. Flow map semigroup (F^h)_{h≥0}, F^h : R^d → R^d, satisfies F^h(u_0) = u(h), F^0(u_0) = u_0; sequence of observations u_k = (F^h)^k(u_0) = u(kh). Represent a deterministic solver with time step h > 0 by Ψ^h : R^d → R^d, [Ψ^h]^k(u_0) ≈ (F^h)^k(u_0) = u_k, for u_0 ∈ R^d, k ∈ N_0. Common task in classical numerical analysis: prove sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)| ≤ C h^{q+1} ⇒ asymptotically, the worst-case error over trajectories decreases with h at rate q.
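The local-order condition sup_{v} |Ψ^h(v) − F^h(v)| ≤ C h^{q+1} can be checked numerically. The following sketch (not from the slides) uses explicit Euler, which has q = 1, on the test ODE u' = −u, whose flow map F^h(v) = e^{−h} v is known in closed form:

```python
import math

def flow(h, v):
    # Exact flow map F^h for the linear test ODE u' = -u: F^h(v) = e^(-h) v.
    return math.exp(-h) * v

def euler(h, v):
    # Explicit Euler step Psi^h(v) = v + h f(v), with f(u) = -u (order q = 1).
    return v + h * (-v)

# The one-step (local) error |Psi^h(v) - F^h(v)| should decay like h^(q+1) = h^2,
# so the ratio err / h^2 stays roughly constant as h shrinks.
for h in [0.1, 0.05, 0.025]:
    err = abs(euler(h, 1.0) - flow(h, 1.0))
    print(f"h = {h:6.3f}   local error = {err:.2e}   err / h^2 = {err / h**2:.3f}")
```

For this linear example the ratio approaches 1/2, the leading Taylor coefficient of e^{−h} − (1 − h).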
  • 12. Motivation for probabilistic approach Available information about solution values off the time grid (kh)_k comes from the on-grid values ([Ψ^h]^k(u_0))_k → need for uncertainty quantification (UQ) in off-grid values. In some cases, more solves on coarse grids yield more statistical information than fewer solves on fine grids → inverse problems, data assimilation, multilevel Monte Carlo. In other cases, the quantity of interest (QoI) is a functional of the solution of the IVP → can use estimates of the off-grid uncertainty due to Ψ^h to estimate the uncertainty in the QoI.
  • 15. Probabilistic solvers of Conrad et al. (2017) Fix a probability space (Ω, P) and a time step h > 0. Using a deterministic solver Ψ^h, generate (U_k)_k with U_0 = u_0 by U_{k+1} := Ψ^h(U_k) + ξ_k(h), ξ_k(h) = ∫_0^h χ_k(s) ds. ♠ The stochastic process χ_k on [0, h] models off-grid behaviour (use Monte Carlo to determine point evaluations). Solution sequences: (u(kh))_k (true solution); ([Ψ^h]^k(u_0))_k (deterministic approximation); (U_k)_k (probabilistic approximation).
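The recursion U_{k+1} = Ψ^h(U_k) + ξ_k(h) can be sketched in a few lines. For illustration only, ξ_k(h) is replaced here by a mean-zero Gaussian draw with standard deviation h^(p+1/2), matching the moment scaling assumed in the later theorems; the base solver is explicit Euler on u' = −u:

```python
import random

def psi_euler(u, h):
    # Deterministic base solver Psi^h for u' = -u (explicit Euler, order q = 1).
    return u + h * (-u)

def probabilistic_step(u, h, p, rng):
    # One step U_{k+1} = Psi^h(U_k) + xi_k(h). For illustration, xi_k(h) is a
    # mean-zero Gaussian draw with standard deviation h^(p + 1/2), a simple
    # stand-in for the integral of a stochastic process chi_k over [0, h].
    return psi_euler(u, h) + rng.gauss(0.0, h ** (p + 0.5))

rng = random.Random(0)
h, p = 0.01, 1.0
U = 1.0
for _ in range(100):  # integrate to T = 1
    U = probabilistic_step(U, h, p, rng)
# U is one random draw from the solver's distribution over trajectories;
# repeating the loop gives samples from P o (U_k)_k^{-1}.
print(U)
```

Each run produces one random trajectory; the ensemble over runs is the nondegenerate measure that approximates the Dirac measure at the true solution.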
  • 17. Two challenges, and their relevance 1) Obtain probabilistic analogues of worst-case bounds over trajectories, i.e. of max_k |u_k − [Ψ^h]^k(u_0)| ≤ C h^q. ♠ Sometimes a worst-case bound is less useful than a "most likely"-case bound (e.g. weather prediction). 2) Describe conditions on (ξ_k)_k so that P ∘ (U_k)_k^{−1} provides a good approximation of the Dirac measure at the true solution. ♠ Improving the random approximations may be cheaper and more effective than improving the deterministic approximation.
  • 19. Strong convergence result of Conrad et al. Theorem. Assume: mean-zero, i.i.d. Gaussian noise (ξ_k(h))_k with E[|ξ_k(h)|^2] ≤ C_ξ (h^2)^{p+1/2}, p ≥ 1; a globally Lipschitz vector field f; and Ψ^h of local order q + 1, i.e. sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)| ≤ C_Ψ h^{q+1}. Then for some C > 0, max_k E[|u_k − U_k|^2] ≤ C h^{2 min{q,p}}. Compare with the deterministic solver: max_k |u_k − [Ψ^h]^k(u_0)|^2 ≤ C h^{2q}. ♠ q = p ⇒ the probabilistic solver has the same convergence rate ⇒ the maximal amount of solution uncertainty consistent with the order of Ψ^h.
  • 22. Recent results Theorem (Stuart, Sullivan). Assume: mean-zero, mutually independent noise (ξ_k(h))_k such that E[|ξ_k(h)|^2] ≤ C_ξ (h^2)^{p+1/2}, p ≥ 1; a globally Lipschitz flow map F^h; and a numerical solver Ψ^h of local order q + 1 satisfying X ∈ L^2_P ⇒ Ψ^h(X) ∈ L^2_P. Then for some C > 0, E[max_k |u_k − U_k|^2] ≤ C h^{2 min{q,p}}. ♠ No Gaussianity assumption. ♠ Weaker Lipschitz condition (on the flow map instead of the vector field). ♠ Stronger convergence, since the supremum is inside the expectation.
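The predicted rate can be probed empirically. The sketch below (illustrative, not code from the talk) estimates E[max_k |u_k − U_k|^2] by Monte Carlo for u' = −u with explicit Euler (q = 1) and noise scaling p = 1; halving h should then shrink the estimate by roughly the factor h^{2 min{q,p}} = h^2 predicts, i.e. about 4:

```python
import math
import random

def strong_error(h, p, n_samples, rng, T=1.0):
    # Monte Carlo estimate of E[max_k |u_k - U_k|^2] for u' = -u, u(0) = 1,
    # using explicit Euler plus Gaussian noise with standard deviation h^(p+1/2).
    n_steps = int(round(T / h))
    total = 0.0
    for _ in range(n_samples):
        U, worst = 1.0, 0.0
        for k in range(1, n_steps + 1):
            U = U + h * (-U) + rng.gauss(0.0, h ** (p + 0.5))
            worst = max(worst, (math.exp(-k * h) - U) ** 2)
        total += worst
    return total / n_samples

rng = random.Random(1)
e_coarse = strong_error(0.1, 1.0, 200, rng)
e_fine = strong_error(0.05, 1.0, 200, rng)
# With q = p = 1 the bound predicts decay like h^2, so e_fine should be
# roughly a quarter of e_coarse (up to Monte Carlo sampling error).
print(e_coarse, e_fine)
```

Note that the maximum over k sits inside the per-sample loop, mirroring the expectation-of-maximum form of the bound.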
  • 24. Proof strategy: use the moment condition on Ψ^h, the zero mean, and independence to construct a martingale; apply a martingale inequality, the Lipschitz property of F^h, and the uniform error bound on Ψ^h to bound E_j := max_{k≤j} |u_k − U_k|^2; take expectations and apply Gronwall's lemma to the sequence (E[E_j])_j. Questions: Can we remove the mean-zero and independence assumptions? Can we consider noise with fewer moments (heavier tails)?
  • 29. Recent results Uniform moment bound assumption: for some p, C_ξ ≥ 1, R ∈ N, E[|ξ_k(h)|^r] ≤ (C_ξ h^{p+1/2})^r, 1 ≤ r ≤ R. ♠ No mean-zero or i.i.d. assumptions; fewer moments if R = 1. Theorem (L.). Assume that the uniform moment bound holds, that Ψ^h has order q + 1, and that F^h is globally Lipschitz. Then for some C > 0, E[max_k |u_k − U_k|^R] ≤ C (h^R)^{min{q, p−1/2}}. ♠ No assumption that X ∈ L^2_P ⇒ Ψ^h(X) ∈ L^2_P. ♣ p = q + 1/2 gives the maximal uncertainty consistent with the order of Ψ^h (tradeoff: zero mean + independence vs. faster moment decay).
  • 33. Significance for uncertainty quantification ♠ Dropping the independence assumption allows for state-dependent noise ξ_k: U_{k+1} := Ψ^h(U_k) + ξ_k(h, (U_j)_{j≤k}). Bayesian approach: choose where to place the next evaluation points, based on which evaluation points have already been chosen. ♠ Better use of the regularity of the noise than the result of Conrad et al. (convergence in the L^R_P topology, for ξ_k(h) ∈ L^R_P). Consequence: if p ≥ q + 1/2 and R ≥ 3, then P(max_k |u_k − U_k| ≥ h) ≤ C (h^R)^q < C (h^2)^q. The nondegenerate probability measure P ∘ (U_k)_k^{−1} improves its approximation of the Dirac measure at the true solution sequence (u_k)_k more rapidly.
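To illustrate what dropping independence permits, here is a hypothetical state-dependent noise model (the |U_k|-proportional scaling and the cap are assumptions chosen purely so that the uniform moment bound E[|ξ_k(h)|^r] ≤ (C_ξ h^{p+1/2})^r still holds uniformly in k):

```python
import random

def state_dependent_step(u, h, p, rng):
    # Hypothetical state-dependent noise: the amplitude scales with |U_k|,
    # which is allowed once independence is dropped. The cap at 1 keeps the
    # uniform moment bound E[|xi_k(h)|^r] <= (h^(p+1/2))^r valid for all k.
    scale = min(1.0, abs(u))
    xi = rng.gauss(0.0, scale * h ** (p + 0.5))
    return u + h * (-u) + xi  # explicit Euler base solver for u' = -u

rng = random.Random(2)
U = 1.0
for _ in range(100):  # integrate to T = 1 with h = 0.01
    U = state_dependent_step(U, 0.01, 1.0, rng)
print(U)
```

A Bayesian solver would go further and let the noise law depend on the whole history (U_j)_{j≤k}; the theorem only requires the moment bound, not any particular dependence structure.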
  • 37. Comparison with classical numerical analysis Assume the uniform moment bound condition, for some R ∈ N. Decay rates of the error incurred by one step: (deterministic) sup_{v ∈ R^d} |Ψ^h(v) − F^h(v)|^R decays like h^{R(q+1)}; (probabilistic) E[|ξ_k(h)|^R] decays like (h^R)^{p+1/2}. Recall: p = q + 1/2 gives the maximal uncertainty consistent with the order of Ψ^h. ♠ Relative to deterministic solvers, a probabilistic solver with the uniform moment bound has the 'same performance'. Cost: the constants depend on the additive noise model (ξ_k)_k. Gain: a solver for UQ is more flexible than worst-case analysis.
  • 40. Related results Corollary (Exponential integrability). If the uniform moment bound condition holds for R = +∞, then E[exp(ρ max_k |u_k − U_k|)] < ∞ for all ρ ∈ R. ♠ Similar techniques yield the same result for max_k |U_k|. Corollary (Convergence in continuous time). If for some p, R, C_ξ ≥ 1, E[sup_{0≤t≤h} |ξ_k(t)|^R] ≤ (C_ξ h^{p+1/2})^R, the flow map F^h is globally Lipschitz, and Ψ^h has order q + 1, then E[sup_{0≤t≤T} |u(t) − U(t)|^R] ≤ C (h^R)^{min{q, p−1/2}}. Analogous results hold for locally Lipschitz vector fields and flow maps, and for implicit Euler Ψ^h.
  • 44. Summary QMC Program: placing points in space. This project: use Monte Carlo to place points in an infinite-dimensional space. Motivation: uncertainty quantification in off-grid values. We provide theoretical support for probabilistic integrators by: showing that they can have the same performance as deterministic solvers (modulo constants); leveraging more information in the noise to get better approximations, faster; removing the mean-zero and independence assumptions, which opens many more possibilities for the placement of evaluation points.
  • 45. Acknowledgements Ilse Ipsen, for supporting this programme; the SAMSI staff, for efficient administration: Thomas Gehrmann, Rebecca Gunn, Sue McDonald, Karem Jackson, and others; Chris Oates, Tim Sullivan, and the members of the WG Probabilistic Numerics.
  • 46. Some literature on probabilistic methods for evolution equations [Caveat: not a complete list] Abdulle and Garegnani, "Random time step probabilistic methods for uncertainty quantification in chaotic and geometric numerical integration", arXiv:1801.01340. Chkrebtii, Campbell, Calderhead, and Girolami, "Bayesian solution uncertainty quantification for differential equations", Bayesian Analysis (2016). Cockayne, Oates, Sullivan, and Girolami, "Probabilistic numerical methods for PDE-constrained Bayesian inverse problems", 36th Int. Workshop on Bayesian Inference and Maximum Entropy Methods, AIP Conf. Proc. (2016). Conrad, Girolami, Särkkä, Stuart, and Zygalakis, "Statistical analysis of differential equations: introducing probability measures on numerical solutions", Stat. Comput. (2017). Kersting and Hennig, "Active uncertainty calibration in Bayesian ODE solvers", Proc. Conf. Uncertainty in A.I. (2016). Lie, Stuart, and Sullivan, "Strong convergence rates of probabilistic solvers for ordinary differential equations", arXiv:1703.03680. Lie, Sullivan, and Teckentrup, "Random forward models and log-likelihoods in Bayesian inverse problems", arXiv:1712.05717. Magnani, Kersting, Schober, and Hennig, "Bayesian filtering for ODEs with bounded derivatives", arXiv:1709.08471. Schober, Särkkä, and Hennig, "A probabilistic model for the numerical solution of initial value problems", Stat. Comput. (2017). Schober, Duvenaud, and Hennig, "Probabilistic ODE solvers with Runge-Kutta means", NIPS (2014). Teymur, Zygalakis, and Calderhead, "Probabilistic Linear Multistep Methods", NIPS (2016).