An Explanation of the Key Equations in Denoising Diffusion Probabilistic Models
Tomonari Masada
1.
Denoising Diffusion Probabilistic Models: An Explanation of the Key Equations
Tomonari Masada (正田 備也), masada@rikkyo.ac.jp
September 13, 2020
2.
Deriving $q(x_t \mid x_0)$
3.
\[
q(x_2 \mid x_0) = \int q(x_2 \mid x_1)\, q(x_1 \mid x_0)\, dx_1
= \prod_{j=1}^{d} \int q(x_{2,j} \mid x_{1,j})\, q(x_{1,j} \mid x_{0,j})\, dx_{1,j}
= \prod_{j=1}^{d} \int \frac{1}{\sqrt{(2\pi)^2 \beta_2 \beta_1}}
\exp\!\left( -\frac{(x_{2,j} - \sqrt{1-\beta_2}\, x_{1,j})^2}{2\beta_2}
            -\frac{(x_{1,j} - \sqrt{1-\beta_1}\, x_{0,j})^2}{2\beta_1} \right) dx_{1,j}
\tag{1}
\]
We focus on the argument of $\exp(\cdot)$ and complete the square in $x_{1,j}$:
\[
\frac{(x_{2,j} - \sqrt{1-\beta_2}\, x_{1,j})^2}{2\beta_2}
+ \frac{(x_{1,j} - \sqrt{1-\beta_1}\, x_{0,j})^2}{2\beta_1}
= \frac{(\beta_1 + \beta_2 - \beta_1\beta_2)\, x_{1,j}^2
  - 2\bigl(\beta_1\sqrt{1-\beta_2}\, x_{2,j} + \beta_2\sqrt{1-\beta_1}\, x_{0,j}\bigr) x_{1,j}
  + \beta_1 x_{2,j}^2 + \beta_2 (1-\beta_1)\, x_{0,j}^2}{2\beta_1\beta_2}
\]
\[
= \frac{\beta_1 + \beta_2 - \beta_1\beta_2}{2\beta_1\beta_2}
\left\{
\left( x_{1,j} - \frac{\beta_1\sqrt{1-\beta_2}\, x_{2,j} + \beta_2\sqrt{1-\beta_1}\, x_{0,j}}{\beta_1 + \beta_2 - \beta_1\beta_2} \right)^2
- \frac{\beta_1^2 (1-\beta_2)\, x_{2,j}^2 + \beta_2^2 (1-\beta_1)\, x_{0,j}^2
        + 2\beta_1\beta_2 \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{2,j} x_{0,j}}{(\beta_1 + \beta_2 - \beta_1\beta_2)^2}
+ \frac{\beta_1 x_{2,j}^2 + \beta_2 (1-\beta_1)\, x_{0,j}^2}{\beta_1 + \beta_2 - \beta_1\beta_2}
\right\}
\]
4.
\[
= \frac{\beta_1 + \beta_2 - \beta_1\beta_2}{2\beta_1\beta_2}
\left\{
\left( x_{1,j} - \frac{\beta_1\sqrt{1-\beta_2}\, x_{2,j} + \beta_2\sqrt{1-\beta_1}\, x_{0,j}}{\beta_1 + \beta_2 - \beta_1\beta_2} \right)^2
+ \frac{\beta_1\beta_2 \bigl( x_{2,j} - \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{0,j} \bigr)^2}{(\beta_1 + \beta_2 - \beta_1\beta_2)^2}
\right\}
\]
\[
= \frac{\beta_1 + \beta_2 - \beta_1\beta_2}{2\beta_1\beta_2}
\left( x_{1,j} - \frac{\beta_1\sqrt{1-\beta_2}\, x_{2,j} + \beta_2\sqrt{1-\beta_1}\, x_{0,j}}{\beta_1 + \beta_2 - \beta_1\beta_2} \right)^2
+ \frac{\bigl( x_{2,j} - \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{0,j} \bigr)^2}{2(\beta_1 + \beta_2 - \beta_1\beta_2)}
\tag{2}
\]
The Gaussian integral over $x_{1,j}$ then gives
\[
\int \exp\!\left( -\frac{\beta_1 + \beta_2 - \beta_1\beta_2}{2\beta_1\beta_2}
\left( x_{1,j} - \frac{\beta_1\sqrt{1-\beta_2}\, x_{2,j} + \beta_2\sqrt{1-\beta_1}\, x_{0,j}}{\beta_1 + \beta_2 - \beta_1\beta_2} \right)^2 \right) dx_{1,j}
= \sqrt{\frac{2\pi \beta_1\beta_2}{\beta_1 + \beta_2 - \beta_1\beta_2}}
\tag{3}
\]
5.
\[
\int q(x_{2,j} \mid x_{1,j})\, q(x_{1,j} \mid x_{0,j})\, dx_{1,j}
= \frac{1}{\sqrt{(2\pi)^2 \beta_2 \beta_1}}
  \sqrt{\frac{2\pi \beta_1\beta_2}{\beta_1 + \beta_2 - \beta_1\beta_2}}
  \exp\!\left( -\frac{\bigl( x_{2,j} - \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{0,j} \bigr)^2}{2(\beta_1 + \beta_2 - \beta_1\beta_2)} \right)
= \frac{1}{\sqrt{2\pi(\beta_1 + \beta_2 - \beta_1\beta_2)}}
  \exp\!\left( -\frac{\bigl( x_{2,j} - \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{0,j} \bigr)^2}{2(\beta_1 + \beta_2 - \beta_1\beta_2)} \right)
\tag{4}
\]
From the above we see that
\[
q(x_{2,j} \mid x_{0,j}) \sim \mathcal{N}\bigl( \sqrt{(1-\beta_2)(1-\beta_1)}\, x_{0,j},\; \beta_1 + \beta_2 - \beta_1\beta_2 \bigr)
\tag{5}
\]
Writing $\alpha_t = 1 - \beta_t$ and $\bar\alpha_t = \prod_{s=1}^{t} \alpha_s$, this becomes
\[
q(x_{2,j} \mid x_{0,j}) \sim \mathcal{N}\bigl( \sqrt{\bar\alpha_2}\, x_{0,j},\; 1 - \bar\alpha_2 \bigr)
\tag{6}
\]
Collecting $j = 1, \dots, d$,
\[
q(x_2 \mid x_0) \sim \mathcal{N}\bigl( \sqrt{\bar\alpha_2}\, x_0,\; (1 - \bar\alpha_2) I \bigr)
\tag{7}
\]
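The result in Eq. (7) can also be checked by simulation. The following sketch (not part of the original slides), assuming $d = 1$ and hypothetical values for $\beta_1$, $\beta_2$, $x_0$, simulates the two forward steps directly and compares the empirical mean and variance of $x_2$ with the closed form.

```python
# A minimal Monte Carlo sanity check of Eq. (7), assuming scalar data (d = 1)
# and hypothetical values beta_1 = 0.1, beta_2 = 0.2, x_0 = 1.5.
import numpy as np

rng = np.random.default_rng(0)
beta1, beta2, x0 = 0.1, 0.2, 1.5
alpha_bar2 = (1 - beta1) * (1 - beta2)

# Simulate the forward chain x_0 -> x_1 -> x_2 many times.
x1 = np.sqrt(1 - beta1) * x0 + np.sqrt(beta1) * rng.standard_normal(1_000_000)
x2 = np.sqrt(1 - beta2) * x1 + np.sqrt(beta2) * rng.standard_normal(1_000_000)

print(x2.mean(), np.sqrt(alpha_bar2) * x0)   # empirical vs. closed-form mean
print(x2.var(), 1 - alpha_bar2)              # empirical vs. closed-form variance
```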
6.
\[
q(x_3 \mid x_0) = \int q(x_3 \mid x_2)\, q(x_2 \mid x_0)\, dx_2
= \prod_{j=1}^{d} \int q(x_{3,j} \mid x_{2,j})\, q(x_{2,j} \mid x_{0,j})\, dx_{2,j}
= \prod_{j=1}^{d} \int \frac{1}{\sqrt{(2\pi)^2 \beta_3 (1 - \bar\alpha_2)}}
\exp\!\left( -\frac{(x_{3,j} - \sqrt{1-\beta_3}\, x_{2,j})^2}{2\beta_3}
            -\frac{(x_{2,j} - \sqrt{\bar\alpha_2}\, x_{0,j})^2}{2(1 - \bar\alpha_2)} \right) dx_{2,j}
\tag{8}
\]
This is the expression we used to obtain $q(x_2 \mid x_0)$, with $\beta_2$ replaced by $\beta_3$ and $\beta_1$ replaced by $1 - \bar\alpha_2$. Hence
\[
q(x_{3,j} \mid x_{0,j}) \sim \mathcal{N}\bigl( \sqrt{(1-\beta_3)\bar\alpha_2}\, x_{0,j},\; 1 - \bar\alpha_2 + \beta_3 \bar\alpha_2 \bigr)
\tag{9}
\]
Since $(1-\beta_3)\bar\alpha_2 = \alpha_3 \bar\alpha_2 = \bar\alpha_3$ and $1 - \bar\alpha_2 + \beta_3 \bar\alpha_2 = 1 - \alpha_3 \bar\alpha_2 = 1 - \bar\alpha_3$,
\[
q(x_{3,j} \mid x_{0,j}) \sim \mathcal{N}\bigl( \sqrt{\bar\alpha_3}\, x_{0,j},\; 1 - \bar\alpha_3 \bigr)
\tag{10}
\]
Continuing in the same way,
\[
q(x_t \mid x_0) \sim \mathcal{N}\bigl( \sqrt{\bar\alpha_t}\, x_0,\; (1 - \bar\alpha_t) I \bigr)
\tag{11}
\]
(This agrees with Eq. (4) of the paper.)
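Eq. (11) says that $x_t$ can be sampled in one shot instead of iterating the chain. A minimal sketch (not from the original slides), assuming a hypothetical linear $\beta$ schedule, compares the step-by-step simulation of the forward process with direct sampling from $\mathcal{N}(\sqrt{\bar\alpha_T}\, x_0,\; 1 - \bar\alpha_T)$.

```python
# A minimal check of Eq. (11): sampling x_T directly from the closed form
# matches the distribution obtained by iterating the forward chain.
# The linear beta schedule below is a hypothetical choice for illustration.
import numpy as np

rng = np.random.default_rng(1)
T = 50
betas = np.linspace(1e-4, 0.02, T)
alpha_bars = np.cumprod(1.0 - betas)

x0 = 2.0
n = 1_000_000

# Step-by-step simulation of q(x_T | x_0).
x = np.full(n, x0)
for beta in betas:
    x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.standard_normal(n)

# Direct sampling from the closed form N(sqrt(alpha_bar_T) x_0, 1 - alpha_bar_T).
x_direct = np.sqrt(alpha_bars[-1]) * x0 + np.sqrt(1 - alpha_bars[-1]) * rng.standard_normal(n)

print(x.mean(), x_direct.mean())   # means should agree
print(x.var(), x_direct.var())     # variances should agree
```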
7.
Deriving $q(x_{t-1} \mid x_t, x_0)$
8.
\[
q(x_{t-1} \mid x_t, x_0) \propto q(x_t \mid x_{t-1})\, q(x_{t-1} \mid x_0)
= \prod_{j=1}^{d} q(x_{t,j} \mid x_{t-1,j})\, q(x_{t-1,j} \mid x_{0,j})
= \prod_{j=1}^{d} \frac{1}{\sqrt{(2\pi)^2 \beta_t (1 - \bar\alpha_{t-1})}}
\exp\!\left( -\frac{(x_{t,j} - \sqrt{1-\beta_t}\, x_{t-1,j})^2}{2\beta_t}
            -\frac{(x_{t-1,j} - \sqrt{\bar\alpha_{t-1}}\, x_{0,j})^2}{2(1 - \bar\alpha_{t-1})} \right)
\tag{12}
\]
Completing the square in $x_{t-1,j}$,
\[
\frac{(x_{t,j} - \sqrt{1-\beta_t}\, x_{t-1,j})^2}{2\beta_t}
+ \frac{(x_{t-1,j} - \sqrt{\bar\alpha_{t-1}}\, x_{0,j})^2}{2(1 - \bar\alpha_{t-1})}
= \frac{1 - \bar\alpha_{t-1} + \beta_t - (1 - \bar\alpha_{t-1})\beta_t}{2(1 - \bar\alpha_{t-1})\beta_t}
\left( x_{t-1,j} - \frac{(1 - \bar\alpha_{t-1})\sqrt{1-\beta_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}
                       {1 - \bar\alpha_{t-1} + \beta_t - (1 - \bar\alpha_{t-1})\beta_t} \right)^2 + \text{const.}
= \frac{1 - \bar\alpha_t}{2(1 - \bar\alpha_{t-1})\beta_t}
\left( x_{t-1,j} - \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)^2 + \text{const.}
\tag{13}
\]
9.
\[
q(x_{t-1,j} \mid x_{t,j}, x_{0,j}) \sim
\mathcal{N}\!\left( \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t},\;
\frac{(1 - \bar\alpha_{t-1})\beta_t}{1 - \bar\alpha_t} \right)
\tag{14}
\]
Collecting $j = 1, \dots, d$,
\[
q(x_{t-1} \mid x_t, x_0) \sim
\mathcal{N}\!\left( \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_t + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_0}{1 - \bar\alpha_t},\;
\frac{(1 - \bar\alpha_{t-1})\beta_t}{1 - \bar\alpha_t}\, I \right)
\tag{15}
\]
(This agrees with Eqs. (6) and (7) of the paper.)
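For reference, Eq. (15) translates directly into a small helper that returns the posterior mean and variance for a given $\beta$ schedule. This is a minimal sketch, not code from the paper; the schedule, time step, and inputs below are hypothetical examples.

```python
# A minimal sketch of Eq. (15): the parameters of the forward-process posterior
# q(x_{t-1} | x_t, x_0). The beta schedule and inputs are hypothetical examples.
import numpy as np

def posterior_params(x_t, x_0, t, betas):
    """Return the mean and variance of q(x_{t-1} | x_t, x_0) for a 1-indexed t >= 2."""
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    alpha_bar_t = alpha_bars[t - 1]
    alpha_bar_prev = alpha_bars[t - 2]
    beta_t = betas[t - 1]
    mean = ((1 - alpha_bar_prev) * np.sqrt(alphas[t - 1]) * x_t
            + beta_t * np.sqrt(alpha_bar_prev) * x_0) / (1 - alpha_bar_t)
    var = (1 - alpha_bar_prev) * beta_t / (1 - alpha_bar_t)
    return mean, var

betas = np.linspace(1e-4, 0.02, 100)
mean, var = posterior_params(x_t=np.array([0.3, -0.8]), x_0=np.array([1.0, 0.5]),
                             t=10, betas=betas)
print(mean, var)
```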
10.
Deriving the ELBO
11.
\[
\ln p(x_0) = \ln \int p(x_{0:T})\, dx_{1:T}
= \ln \int p(x_T) \prod_{t=1}^{T} p(x_{t-1} \mid x_t)\, dx_{1:T}
= \ln \int q(x_{1:T} \mid x_0)\, \frac{p(x_T) \prod_{t=1}^{T} p(x_{t-1} \mid x_t)}{q(x_{1:T} \mid x_0)}\, dx_{1:T}
\]
\[
\geq \int q(x_{1:T} \mid x_0) \ln \frac{p(x_T) \prod_{t=1}^{T} p(x_{t-1} \mid x_t)}{q(x_{1:T} \mid x_0)}\, dx_{1:T}
= \int q(x_{1:T} \mid x_0) \ln \frac{p(x_T) \prod_{t=1}^{T} p(x_{t-1} \mid x_t)}{\prod_{t=1}^{T} q(x_t \mid x_{t-1})}\, dx_{1:T}
= \mathbb{E}_q\!\left[ \ln p(x_T) + \sum_{t=1}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_t \mid x_{t-1})} \right]
= \mathbb{E}_q\!\left[ \ln p(x_T) + \sum_{t=2}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_t \mid x_{t-1})} + \ln \frac{p(x_0 \mid x_1)}{q(x_1 \mid x_0)} \right]
\tag{16}
\]
(The inequality follows from Jensen's inequality.)
12.
\[
q(x_{t-1} \mid x_t, x_0) = \frac{q(x_t, x_{t-1} \mid x_0)}{q(x_t \mid x_0)}
= \frac{q(x_t \mid x_{t-1}, x_0)\, q(x_{t-1} \mid x_0)}{q(x_t \mid x_0)}
= \frac{q(x_t \mid x_{t-1})\, q(x_{t-1} \mid x_0)}{q(x_t \mid x_0)}
\tag{17}
\]
Here the last equality holds because of the Markov property of the forward process.
\[
\therefore\; \ln p(x_0) \geq
\mathbb{E}_q\!\left[ \ln p(x_T) + \sum_{t=2}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)} \cdot \frac{q(x_{t-1} \mid x_0)}{q(x_t \mid x_0)} + \ln \frac{p(x_0 \mid x_1)}{q(x_1 \mid x_0)} \right]
\]
\[
= \mathbb{E}_q\!\left[ \ln p(x_T) + \sum_{t=2}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}
+ \sum_{t=2}^{T} \ln q(x_{t-1} \mid x_0) - \sum_{t=2}^{T} \ln q(x_t \mid x_0)
+ \ln \frac{p(x_0 \mid x_1)}{q(x_1 \mid x_0)} \right]
\]
\[
= \mathbb{E}_q\!\left[ \ln p(x_T) + \sum_{t=2}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}
+ \ln q(x_1 \mid x_0) - \ln q(x_T \mid x_0)
+ \ln \frac{p(x_0 \mid x_1)}{q(x_1 \mid x_0)} \right]
= \mathbb{E}_q\!\left[ \ln \frac{p(x_T)}{q(x_T \mid x_0)}
+ \sum_{t=2}^{T} \ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}
+ \ln p(x_0 \mid x_1) \right]
\tag{18}
\]
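Eq. (17) can be verified numerically in one dimension: the Bayes-rule combination of the three forward-process Gaussians should reproduce the closed-form posterior density of Eq. (14). A minimal sketch (not from the original slides) with a hypothetical $\beta$ schedule and arbitrary $x_0$, $x_t$:

```python
# A minimal numerical check of Eq. (17) in one dimension: the Bayes-rule
# expression q(x_t|x_{t-1}) q(x_{t-1}|x_0) / q(x_t|x_0) should coincide with the
# closed-form posterior of Eq. (14). All inputs below are hypothetical examples.
import numpy as np
from scipy.stats import norm

betas = np.linspace(1e-4, 0.02, 100)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

t = 10                       # 1-indexed step
x0, xt = 0.8, -0.4
beta_t = betas[t - 1]
ab_t, ab_prev = alpha_bars[t - 1], alpha_bars[t - 2]

grid = np.linspace(-3, 3, 5)     # a few test points for x_{t-1}

# Right-hand side of Eq. (17), built from the three forward-process Gaussians.
bayes = (norm.pdf(xt, np.sqrt(1 - beta_t) * grid, np.sqrt(beta_t))
         * norm.pdf(grid, np.sqrt(ab_prev) * x0, np.sqrt(1 - ab_prev))
         / norm.pdf(xt, np.sqrt(ab_t) * x0, np.sqrt(1 - ab_t)))

# Closed-form posterior density from Eq. (14).
mean = ((1 - ab_prev) * np.sqrt(alphas[t - 1]) * xt + beta_t * np.sqrt(ab_prev) * x0) / (1 - ab_t)
var = (1 - ab_prev) * beta_t / (1 - ab_t)
closed = norm.pdf(grid, mean, np.sqrt(var))

print(np.allclose(bayes, closed))   # expected: True
```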
13.
Let $p(x_{t-1} \mid x_t) = \prod_{j=1}^{d} \frac{1}{\sqrt{2\pi\sigma_t^2}} \exp\!\left( -\frac{(x_{t-1,j} - \mu_j(x_t, t))^2}{2\sigma_t^2} \right)$. Then
\[
\ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}
= -\sum_{j=1}^{d} \frac{(x_{t-1,j} - \mu_j(x_t, t))^2}{2\sigma_t^2}
+ \sum_{j=1}^{d} \frac{\left( x_{t-1,j} - \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)^2}{2\,\frac{(1 - \bar\alpha_{t-1})\beta_t}{1 - \bar\alpha_t}}
+ \text{const.}
\tag{19}
\]
Assuming, as in the paper, that $\sigma_t^2 = \frac{(1 - \bar\alpha_{t-1})\beta_t}{1 - \bar\alpha_t}$, we get
\[
\ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}
= \frac{1}{2\sigma_t^2} \sum_{j=1}^{d}
\left[ 2 x_{t-1,j} \left( \mu_j(x_t, t) - \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)
- \mu_j(x_t, t)^2
+ \left( \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)^2 \right]
+ \text{const.}
\tag{20}
\]
14.
Taking the expectation with respect to $q(x_{t-1} \mid x_t, x_0)$, so that $x_{t-1,j}$ is replaced by the posterior mean from Eq. (14),
\[
\int q(x_{t-1} \mid x_t, x_0) \ln \frac{p(x_{t-1} \mid x_t)}{q(x_{t-1} \mid x_t, x_0)}\, dx_{t-1}
= \frac{1}{2\sigma_t^2} \sum_{j=1}^{d}
\left[ 2\, \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t}
\left( \mu_j(x_t, t) - \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)
- \mu_j(x_t, t)^2
+ \left( \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)^2 \right]
+ \text{const.}
\]
\[
= -\frac{1}{2\sigma_t^2} \sum_{j=1}^{d}
\left( \mu_j(x_t, t) - \frac{(1 - \bar\alpha_{t-1})\sqrt{\alpha_t}\, x_{t,j} + \beta_t \sqrt{\bar\alpha_{t-1}}\, x_{0,j}}{1 - \bar\alpha_t} \right)^2
+ \text{const.}
\tag{21}
\]
(The sign here is opposite to that of Eq. (8) in the paper: the paper bounds the negative log evidence from above, whereas this note bounds the log evidence from below, so the two expressions differ by a sign.)
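Finally, Eq. (21) can be confirmed by Monte Carlo in one dimension: when $p$ and $q$ share the variance $\sigma_t^2$, the expectation $\mathbb{E}_q[\ln p/q]$ equals $-(\mu - \tilde\mu)^2 / (2\sigma_t^2)$ exactly, the additive constant being zero. This is a minimal sketch with hypothetical numbers, not code from the paper.

```python
# A minimal Monte Carlo check of Eq. (21) in one dimension: when p and q share
# the variance sigma_t^2, E_q[ln p/q] equals -(mu - mu_tilde)^2 / (2 sigma_t^2).
# All concrete numbers below are hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu_tilde, mu, sigma2 = 0.25, 0.60, 0.05   # posterior mean, model mean, shared variance

samples = rng.normal(mu_tilde, np.sqrt(sigma2), size=2_000_000)
mc = np.mean(norm.logpdf(samples, mu, np.sqrt(sigma2))
             - norm.logpdf(samples, mu_tilde, np.sqrt(sigma2)))

print(mc, -(mu - mu_tilde) ** 2 / (2 * sigma2))   # the two values should be close
```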