Quantifying Nonparametric
Modeling Uncertainty with BART
Ed George
Wharton, University of Pennsylvania
Collaborations with:
Hugh Chipman (Acadia), Rob McCulloch (Arizona State)
Matt Pratola (Ohio State), Tom Shively (UT Austin)
The SAMSI Program on Model Uncertainty:
Mathematical and Statistical (MUMS)
August 20, 2018
1 / 62
Part I. BART (Bayesian Additive Regression Trees)
Data: n observations of y and x = (x1, ..., xp)
Suppose: Y = f(x) + ε, ε symmetric with mean 0
Bayesian Ensemble Idea: Approximate unknown f (x) by a form
f (x) = g(x; θ1) + g(x; θ2) + ... + g(x; θm)
θ1, θ2, . . . , θm iid ∼ π(θ)
and use the posterior of f given y for inference.
BART is obtained when each g(x; θj ) is a regression tree.
Key calibration: Using y, set π(θ) so that Var(f ) ≈ Var(y).
2 / 62
Beginning with a Single Regression Tree Model
3 / 62
Bayesian CART: Just add a prior π(M, T)
Bayesian CART Model Search
(Chipman, George, McCulloch 1998)
π(M, T) = π(M | T)π(T)
π(M | T) : (µ1, µ2, . . . , µb) ∼ Nb(0, τ²I)
π(T): Stochastic process to generate tree skeleton plus uniform
prior on splitting variables and splitting rules.
Closed form for π(T | y) facilitates MCMC stochastic search for
promising trees.
(See also Denison, Mallick and Smith 1998)
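To make the π(T) tree-generating process concrete, here is a minimal Python sketch. The depth-dependent split probability α(1 + d)^(−β) is an assumed illustrative form (not stated on this slide); the uniform choices of splitting variable and cutpoint follow the prior described above.

```python
import numpy as np

def grow_skeleton(depth=0, alpha=0.95, beta=2.0, p=5, rng=None):
    """Recursively generate a tree skeleton from a stochastic process prior.

    A node at depth d becomes internal with probability alpha * (1 + d) ** (-beta)
    (an assumed form); its splitting variable and cutpoint are drawn uniformly.
    """
    rng = rng if rng is not None else np.random.default_rng()
    if rng.uniform() < alpha * (1 + depth) ** (-beta):
        return {"var": int(rng.integers(p)),      # uniform over the p predictors
                "cut": rng.uniform(),             # uniform splitting rule on [0, 1]
                "left": grow_skeleton(depth + 1, alpha, beta, p, rng),
                "right": grow_skeleton(depth + 1, alpha, beta, p, rng)}
    return {"leaf": True}                         # bottom node, later given a mu

tree = grow_skeleton(rng=np.random.default_rng(1))
```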
4 / 62
Moving on to BART
Bayesian Additive Regression Trees
(Chipman, George, McCulloch 2010)
The BART ensemble model
Y = g(x; T1, M1)+g(x; T2, M2)+. . .+g(x; Tm, Mm)+σz, z ∼ N(0, 1)
Each (Ti , Mi ) identifies a single tree.
E(Y | x, T1, M1, . . . , Tm, Mm) is the sum of m bottom node µ’s,
one from each tree.
Number of trees m can be much larger than sample size n.
g(x; T1, M1), g(x; T2, M2), ..., g(x; Tm, Mm) is a highly redundant
“over-complete basis” with many many parameters.
5 / 62
Complete the Model with a Regularization Prior
π((T1, M1), (T2, M2), . . . , (Tm, Mm), σ)
π applies the Bayesian CART prior to each (Tj , Mj ) independently
so that:
Each T small.
Each µ small.
The observed variation of y is used to calibrate the choice of the
hyperparameters for the µ and σ priors.
π is a “regularization prior” as it keeps the contribution of each
g(x; Ti , Mi ) small, explaining only a small portion of the fit.
6 / 62
Build up the fit by adding up tiny bits of fit ...
7 / 62
Connections to Other Modeling Ideas
Bayesian Nonparametrics:
Lots of parameters (to make model flexible)
A strong prior to shrink towards simple structure (regularization)
BART shrinks towards additive models with some interaction
Dynamic Random Basis:
g(x;T1,M1), ..., g(x;Tm,Mm) are dimensionally adaptive
Gradient Boosting:
Overall fit becomes the cumulative effort of many “weak learners”
8 / 62
Some Distinguishing Features of BART
BART is NOT Bayesian model averaging of a single tree model
Unlike boosting and random forests, the BART algorithm updates
a fixed set of m trees, over and over
Choose m large for best estimation of E[Y |x] and prediction
More trees yields more approximation flexibility
Choose m small for variable selection
Fewer trees force the x’s to compete for entry
9 / 62
A Sketch of the BART MCMC Algorithm
Outer Loop is a “simple” Gibbs sampler:
Iteratively sample each (Ti , Mi ) given Y , σ and all other
(Tj , Mj )’s (Bayesian Backfitting)
Sample σ given Y and (T1, M1, . . . , Tm, Mm)
To sample (Ti , Mi ) above,
1. Subtract the contributions of all the other trees from both
sides to get a simple one-tree model update.
2. Integrate out M to sample T and then sample M | T.
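A minimal structural sketch of that outer loop; `draw_tree_and_mu` and `draw_sigma` are hypothetical placeholders standing in for the conditional draws sketched above (Metropolis-Hastings for T, then M | T, then σ).

```python
import numpy as np

def bart_gibbs_sweep(y, X, trees, sigma, draw_tree_and_mu, draw_sigma):
    """One Bayesian backfitting sweep over the m trees, then sigma.

    `trees` is a list of single-tree fit functions; the two draw_* arguments
    are placeholders standing in for the actual conditional samplers.
    """
    fits = np.array([t(X) for t in trees])            # m x n matrix of tree fits
    for i in range(len(trees)):
        resid = y - (fits.sum(axis=0) - fits[i])      # subtract all other trees
        trees[i] = draw_tree_and_mu(resid, X, sigma)  # simple one-tree model update
        fits[i] = trees[i](X)
    sigma = draw_sigma(y - fits.sum(axis=0))          # sigma | trees, y
    return trees, sigma
```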
10 / 62
For the draw of T we use a Metropolis-Hastings within Gibbs step.
Because p(T | data) is available in closed form (up to a norming constant), our proposal moves around tree space by proposing local modifications such as the "birth-death" step: a birth proposes a more complex tree, a death proposes a simpler tree.
Such modifications are accepted according to their compatibility with p(T | data).
... as the MCMC runs, each tree in the sum will grow and shrink,
swapping fit amongst them ....
11 / 62
Using the MCMC Output to Draw Inference
After convergence (which happens surprisingly quickly), each
iteration d results in a draw from the posterior of f
f̂d(·) = g(·; T1d, M1d) + · · · + g(·; Tmd, Mmd)
To estimate f(x) we simply average the f̂d(·) draws at x
Posterior uncertainty is captured by the variation of the f̂d(x)
e.g., a 95% HPD region is estimated by the middle 95% of values
Can do the same with functionals of f .
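For instance, if the D kept draws f̂d(x) over n points are stacked in a D × n array, the estimate and pointwise 95% interval are one line each (a sketch; the simulated array stands in for real MCMC output).

```python
import numpy as np

rng = np.random.default_rng(0)
D, n = 1000, 50
f_draws = rng.normal(size=(D, n))     # stand-in for the draws f_d(x) at n values of x

f_hat = f_draws.mean(axis=0)                            # posterior mean estimate of f(x)
lo, hi = np.percentile(f_draws, [2.5, 97.5], axis=0)    # middle 95% of the draws at each x
```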
12 / 62
Out of Sample Prediction
Predictive comparisons on 42 data sets.
Data from Kim, Loh, Shih and Chaudhuri (2006) (thanks Wei-Yin Loh!)
p = 3 to 65, n = 100 to 7,000.
for each data set 20 random splits into 5/6 train and 1/6 test
use 5-fold cross-validation on train to pick hyperparameters (except
BART-default!)
gives 20 × 42 = 840 out-of-sample predictions; for each prediction, divide the RMSE of each method by the smallest
+ each boxplot represents 840 predictions for a method
+ 1.2 means you are 20%
worse than the best
+ BART-cv best
+ BART-default (use default
prior) does amazingly
well!!
[Boxplots of relative RMSE (scale 1.0 to 1.5) for Random Forests, Neural Nets, Boosting, BART-cv, and BART-default.]
13 / 62
Automatic Uncertainty Quantification
A simple simulated 1-dimensional example
[Left: 95% pointwise posterior intervals, BART. Right: 95% pointwise posterior intervals, mBART. Both panels show the posterior mean and the true f.]
Note: mBART (right plot) is still to be discussed
14 / 62
Uncertainty Quantification of Marginal Predictor Effects
Partial Dependence Plot of Crime Effect in Boston Housing Data
Estimates are f̂(xi) = (1/n) Σj f̂(xi, x−i,j), the average over the n observed values x−i,j of the other predictors
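A sketch of that average in Python, with `predict` a stand-in for posterior-mean prediction from a fitted BART model.

```python
import numpy as np

def partial_dependence(predict, X, col, grid):
    """Average predictions over the observed rows of X with column `col` fixed at each grid value."""
    curve = []
    for v in grid:
        Xv = X.copy()
        Xv[:, col] = v                    # fix the predictor of interest (e.g., crime)
        curve.append(predict(Xv).mean())  # average over the other predictors' observed values
    return np.array(curve)
```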
15 / 62
Example: Friedman’s Simulated Data
Y = f(x) + ε, ε ∼ N(0, 1)
where
f(x) = 10 sin(πx1x2) + 20(x3 − .5)² + 10x4 + 5x5 + 0x6 + · · · + 0x10
xi's iid ∼ Uniform(0, 1)
Only the first 5 xi ’s matter!
Friedman (1991) used n = 100 observations from this model
to illustrate the potential of MARS
BART handily outperforms competitors including random
forests, neural nets and gradient boosting on this example.
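A sketch of simulating Friedman's test function (here with numpy; n = 100 and p = 10 as in the slide).

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 10
X = rng.uniform(size=(n, p))
f = (10 * np.sin(np.pi * X[:, 0] * X[:, 1])
     + 20 * (X[:, 2] - 0.5) ** 2
     + 10 * X[:, 3] + 5 * X[:, 4])        # x6, ..., x10 contribute nothing
y = f + rng.normal(size=n)                # epsilon ~ N(0, 1)
```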
16 / 62
Applying BART to the Friedman Data
With n = 100 observations and m = 100 trees
17 / 62
Detecting Low Dimensional Structure in High Dimensional Data
18 / 62
Measuring Variable Importance in BART
19 / 62
Creating a Competitive Bottleneck
20 / 62
Variable Selection via BART
21 / 62
Part II. MBART - Monotone BART
Multidimensional Monotone BART
(Chipman, George, McCulloch, Shively 2018)
Idea:
Approximate multivariate monotone functions by the sum of many
single tree models, each of which is monotonic.
22 / 62
An Example of a Monotonic Tree
[Figure: three different views of a bivariate monotonic tree, plotted as f(x) over (x1, x2).]
23 / 62
In what sense is this single tree monotonic?
A tree function g is said to be monotonic in xi if for any δ > 0,
g(x1, x2, . . . , xi + δ, xi+1, . . . , xk; T, M)
≥ g(x1, x2, . . . , xi , xi+1, . . . , xk; T, M).
For simplicity and wlog, let’s restrict attention to monotone
nondecreasing functions.
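A small numerical check of this definition, assuming g takes an n × k array of points; the example g below is an illustrative placeholder, not a fitted tree.

```python
import numpy as np

def is_monotone_in(g, i, k=2, delta=0.05, n_points=200, seed=2):
    """Check g(x + delta * e_i) >= g(x) at randomly sampled points in [0, 1]^k."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(size=(n_points, k))
    X_shift = X.copy()
    X_shift[:, i] += delta
    return bool(np.all(g(X_shift) >= g(X)))

g = lambda X: X[:, 0] * X[:, 1]           # nondecreasing in both coordinates on [0, 1]^2
print(is_monotone_in(g, 0), is_monotone_in(g, 1))   # True True
```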
24 / 62
To implement this monotonicity in "tree language" we simply
constrain the mean level of a node to be greater than those of its
below neighbors and less than those of its above neighbors.
[Figure: partition of the (x1, x2) space into bottom nodes 4, 7, 10, 11, 12, 13.]
node 7 is disjoint from node 4.
node 10 is a below neighbor of node 13.
node 7 is an above neighbor of node 13.
The mean level of node 13 must be greater than those of 10 and
12 and less than that of node 7.
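In code, the constraint on a single bottom node is just a pair of comparisons against the means of its below and above neighbors; the node means below are hypothetical values chosen to satisfy the constraint for node 13 in the figure.

```python
def node_mean_ok(mu, below, above):
    """The node's mean must exceed every below-neighbor mean and be below every above-neighbor mean."""
    return all(mu > b for b in below) and all(mu < a for a in above)

mu = {4: 0.1, 7: 0.9, 10: 0.2, 11: 0.3, 12: 0.4, 13: 0.5}    # hypothetical bottom-node means
# Node 13 has below neighbors 10 and 12 and above neighbor 7 (node 4 is disjoint).
print(node_mean_ok(mu[13], below=[mu[10], mu[12]], above=[mu[7]]))   # True
```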
25 / 62
The MBART Prior
Recall the BART parameter
θ = ((T1, M1), (T2, M2), . . . , (Tm, Mm), σ)
Let S = {θ : every tree is monotonic in a designated subset of xi s}
To impose the monotonicity we simply truncate the BART prior
π(θ) to the set S
π∗(θ) ∝ π(θ) IS(θ)
where IS(θ) = 1 if every tree in θ is monotonic.
26 / 62
A New BART MCMC “Christmas Tree” Algorithm
π((T1, M1), (T2, M2), . . . , (Tm, Mm), σ | y)
Bayesian backfitting again: Iteratively sample each (Tj , Mj ) given
(y, σ) and other (Tj , Mj )’s
Each (T0, M0) → (T1, M1) update is sampled as follows:
Denote the move as (T0, M0_Common, M0_Old) → (T1, M0_Common, M1_New)
Propose T∗ via birth, death, etc.
If M-H with π(T, M | y) accepts (T∗, M0_Common):
Set (T1, M1_Common) = (T∗, M0_Common)
Sample M1_New from π(M_New | T1, M1_Common, y)
Only M0_Old → M1_New needs to be updated.
Works for both BART and MBART.
27 / 62
Example: Product of two x’s
Let’s consider a very simple simulated monotone example:
Y = x1 x2 + ε, xi ∼ Uniform(0, 1).
Here is the plot of the true function f (x1, x2) = x1 x2
28 / 62
First we try a single (just one tree), unconstrained tree model.
Here is the graph of the fit.
The fit is not terrible, but there are some aspects of the fit which
violate monotonicity.
29 / 62
Here is the graph of the fit with the monotone constraint:
We see that our fit is monotonic, and more representative of the
true f .
30 / 62
Here is the unconstrained BART fit:
Much better (of course) but not monotone!
31 / 62
And, finally, the constrained MBART fit:
Not Bad!
Same method works with any number of x’s!
32 / 62
A 5-Dimensional Example
Y = x1 x2² + x3 x4³ + x5 + ε, ε ∼ N(0, σ²), xi ∼ Uniform(0, 1).
For various choices of σ, we simulated 5,000 observations.
33 / 62
RMSE improvement over unconstrained BART
[Boxplots of RMSE for bart−1/mbart−1 through bart−4/mbart−4, corresponding to σ = 0.2, 0.5, 0.7, 1.0.]
34 / 62
Discovering Monotonicity with MBART
Suppose we are not sure if a function is monotone.
Good news! MBART can be deployed to estimate the monotone
components of f .
Thus monotonicity can be discovered rather than imposed!
Example: Suppose Y = x³ + ε.
[Left: scatterplot of the data with BART and mBART fits of y on x. Right: mBART fit of y on (x up, x down), showing the monotone up fit, the monotone down fit, and the overall fit.]
35 / 62
Example: Suppose Y = x² + ε.
[Left: scatterplot of the data with BART and mBART fits of y on x. Right: mBART fit of y on (x up, x down), showing the monotone up fit, the monotone down fit, and the overall fit.]
36 / 62
Example: Suppose Y = sin(x) + ε.
[Left: scatterplot of the data with BART and mBART fits of y on x. Right: mBART fit of y on (x up, x down), showing the monotone up fit, the monotone down fit, and the overall fit.]
37 / 62
Part III. HBART - Heteroscedastic BART
Heteroscedastic BART via Multiplicative Regression Trees
(Pratola, Chipman, George, McCulloch 2018)
BART flexibly fits the conditional mean with a sum of trees.
HBART flexibly fits both the conditional mean with a sum of
trees and the conditional variance with a product of trees.
38 / 62
The HBART model:
Y = f(x) + s(x) Z, Z ∼ N(0, 1)
f(x) = g(x; T1, M1) + · · · + g(x; Tm, Mm)
s²(x) = h(x; T1, V1) × · · · × h(x; Tm, Vm)
Each (Ti , Mi ) identifies a tree model for a mean component.
Each (Ti , Vi ) identifies a tree model for a variance component.
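A sketch of how a draw of Y is assembled from the two ensembles; the arrays g and h below are stand-ins for the m mean-tree and variance-tree functions evaluated at n points.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 200, 500
g = rng.normal(scale=0.1, size=(m, n))            # g(x; T_i, M_i) at n points, small contributions
h = np.exp(rng.normal(scale=0.05, size=(m, n)))   # h(x; T_i, V_i) > 0, each near 1

f = g.sum(axis=0)                                 # f(x): sum of the m mean trees
s2 = h.prod(axis=0)                               # s^2(x): product of the m variance trees
y = f + np.sqrt(s2) * rng.normal(size=n)          # Y = f(x) + s(x) Z
```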
39 / 62
The Mean Components Prior for HBART
Key to BART is the simple prior on the bottom node µ parameters.
After centering Y at zero, we set them to be iid with
µ ∼ N(0, τ²).
At any x, f(x) = µ1 + µ2 + · · · + µm, one µ from each tree, so that
f(x) ∼ N(0, m τ²).
This simplifies calibration of τ² with the observed variance of Y.
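Combining f(x) ∼ N(0, m τ²) with the calibration Var(f) ≈ Var(y) from Part I suggests choosing τ on the order of sd(y)/√m. A sketch follows; BART implementations typically use a closely related range-based rule involving the hyperparameter k mentioned later.

```python
import numpy as np

def calibrate_tau(y, m):
    """Pick tau so that Var(f) = m * tau^2 roughly matches the observed Var(y)."""
    return np.std(y) / np.sqrt(m)

rng = np.random.default_rng(4)
y = rng.normal(loc=3.0, scale=2.0, size=500)
tau = calibrate_tau(y - y.mean(), m=200)     # center y at zero first, as on this slide
```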
40 / 62
The Variance Components Prior for HBART
For the Vi (bottom node variance components of Ti ), we use:
σ² ∼ νλ/χ²_ν, iid.
Then
s(x) = σ1 σ2 · · · σm, one σ from each variance tree.
This prior is not as simple as the µ prior but by a simple
moment-matching strategy, we have a good heuristic for the choice
of ν and λ.
And, we use the same priors for the mean trees and the variance trees.
And, the simplicity of the HBART MCMC is maintained!!
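A sketch of drawing the bottom-node variance components and of the first moment used in a matching heuristic (E[νλ/χ²_ν] = νλ/(ν − 2) for ν > 2); the particular ν and λ here are placeholders, not the paper's recommended values.

```python
import numpy as np

rng = np.random.default_rng(5)
nu, lam = 10.0, 1.0                                   # placeholder hyperparameters
sigma2 = nu * lam / rng.chisquare(nu, size=100_000)   # sigma^2 ~ nu * lambda / chi^2_nu
print(sigma2.mean(), nu * lam / (nu - 2))             # empirical vs theoretical prior mean
```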
41 / 62
The HBART MCMC Algorithm
With an enhanced Metropolis proposal, the HBART MCMC
Bayesian Backfitting algorithm seems to converge quickly!
Top: draws of σ in BART.
Middle: draws of s(xi) for 5 values of xi in HBART.
Bottom: draws of the average s(xi) in HBART.
42 / 62
At each MCMC iteration we have draws of all the
(Ti , Mi ), i = 1, 2, . . . , m
and
(Ti , Vi ), i = 1, 2, . . . , m
At MCMC iteration d we have a draw fd of the function f and a draw s²d of the function s².
So, for example, at any x, we could use
f̂(x) = (1/D) Σ_{d=1}^D fd(x),   ŝ²(x) = (1/D) Σ_{d=1}^D s²d(x)
43 / 62
A Simple 1-Dimensional Simulated Example
f̂(x) and f̂(x) ± 2 ŝ(x)
44 / 62
Pointwise intervals.
Inference for f . Inference for s.
[Left: f(x) vs x with the true mean, estimated mean, and 95% interval. Right: s(x) vs x with the true standard deviation, estimated standard deviation, and 95% interval.]
45 / 62
The previous displays used the fact that x is one-dimensional.
But the next two displays can be used with a vector x of any
dimension.
46 / 62
Given {xi}, sort by ŝ(xi), then plot 95% quantile intervals for s(xi)
vs ŝ(xi).
The horizontal line is the ordinary BART estimate of σ.
This “H-evidence” plot is useful for gauging the extent of
heteroscedasticity.
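A sketch of building the H-evidence plot from a D × n array of posterior draws of s(xi) (simulated here) and a single homoscedastic σ̂ from ordinary BART.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(6)
D, n = 1000, 200
s_draws = rng.lognormal(mean=np.log(np.linspace(0.5, 2.0, n)), sigma=0.1, size=(D, n))
sigma_bart = 1.0                                      # stand-in for the ordinary BART sigma-hat

s_hat = s_draws.mean(axis=0)
order = np.argsort(s_hat)                             # sort the x_i by s-hat(x_i)
lo, hi = np.percentile(s_draws, [2.5, 97.5], axis=0)

plt.vlines(s_hat[order], lo[order], hi[order], lw=0.5)   # 95% intervals for s(x_i)
plt.axhline(sigma_bart, color="red")                     # ordinary BART estimate of sigma
plt.xlabel("s-hat(x)")
plt.ylabel("s(x) posterior")
plt.show()
```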
47 / 62
Given training or test (xi, yi):
for each (fd, sd) draw, let ỹid = fd(xi) + sd(xi) zd, zd standard normal.
for each i, compute the quantile of yi in the draws ỹid.
If the model is right, the quantiles should look like draws from the
uniform. This can be gauged with qq-plots.
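A sketch of this check; the arrays of draws fd(xi) and sd(xi) are simulated stand-ins for real MCMC output, generated so that the fitted model matches the data-generating one.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
D, n = 2000, 300
f_draws = 0.1 * rng.normal(size=(D, n))     # posterior draws of f(x_i), concentrated near the true f = 0
s_draws = np.ones((D, n))                   # posterior draws of s(x_i), equal to the true s = 1
y = rng.normal(size=n)                      # observed responses drawn from the true model

z = rng.normal(size=(D, 1))                 # one z_d per posterior draw
y_tilde = f_draws + s_draws * z             # predictive draws y~_id
q = (y_tilde < y).mean(axis=0)              # quantile of each y_i among its draws

plt.plot(np.sort(q), np.linspace(0, 1, n))  # should hug the 45-degree line
plt.plot([0, 1], [0, 1], "k--")
plt.xlabel("sample quantile")
plt.ylabel("uniform")
plt.show()
```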
[QQ-plots of predictive quantiles: HBART (left), BART (right).]
48 / 62
For numeric responses, we typically check out-of-sample
predictions using RMSE.
However, that just checks the point prediction.
Our Bayesian model gives us a full predictive distribution for
Y | x
With the qq-plots, we assess the full distributional fit, rather than
just the point prediction.
49 / 62
Used Car Prices Example
Real example, with 15 predictor variables.
y is the price of a used car
x is characteristics of the car: mileage, year, trim, color, etc.
So we are “nonparametrically” estimating both the mean and the
variance of the price, each as a function of 15 variables.
50 / 62
The H-evidence plot reveals substantial heteroscedasticity
51 / 62
The qq-plots show a clear improvement over BART
Not perfect, but pretty good!!
52 / 62
Hyperparameter Selection via e-distance Cross Validation
k is a key prior hyperparameter which determines the smoothness
of the function f estimate.
Rather than using RMSE cross-validation for choosing k, we use
the e-distance measure between the empirical quantiles and the
uniform quantiles.
Each boxplot tells us how good the qq-plot looks on a bunch of
randomly chosen test data sets.
[Boxplots of e-distance over randomly chosen test sets: BART (left), HBART (right).]
53 / 62
Variable selection for both f (x) and s(x):
f at left.
s at right.
s(x) uses trim.other in addition to mileage and year.
54 / 62
Fish and Alcohol Examples
Fish
The dependent variable y is the daily catch of fishing boats in the
Grand Bank fishing grounds (Fernandez et al., 2002).
The 25 explanatory x variables capture time, location, and
characteristics of the boat.
55 / 62
The H-evidence plot reveals substantial heteroscedasticity
[H-evidence plot: 95% posterior intervals for s(x), ordered by ŝ(x).]
56 / 62
[Predictive qq-plots (sample quantile vs uniform): heteroskedastic model (left), homoskedastic model (right).]
Even though we know y ≥ 0, simple HBART is not too bad!!
57 / 62
Alcohol
The dependent variable y is the number of alcoholic beverages
consumed in the last two weeks. (Kenkel and Terza, 2001).
The 35 explanatory x variables capture demographic and physical
characteristics of the respondents as well as a key treatment
variable indicating receipt of advice from a physician.
58 / 62
The H-evidence plot does not show heteroscedasticity
[H-evidence plot: 95% posterior intervals for s(x), ordered by ŝ(x).]
59 / 62
[Predictive qq-plots (sample quantile vs uniform): heteroskedastic model (left), homoskedastic model (right).]
Even though we know y ≥ 0, ordinary BART is not too bad!!
60 / 62
Concluding Remarks
Despite its many compelling successes in practice, theoretical
frequentist support for BART is only now beginning to
appear.
In particular, Rockova and van der Pas (2017) Posterior
Concentration for Bayesian Regression Trees and Their
Ensembles recently obtained the first theoretical results for
Bayesian CART and BART, showing near-minimax posterior
concentration when p > n for classes of Hölder continuous
functions.
Software for BART is on CRAN, for MBART at
https://bitbucket.org/remcc/mbart (use git clone to get),
and for HBART coming soon as rBART on CRAN.
61 / 62
Thank You!
62 / 62
