Journal Review
Unsupervised Deep Learning Applied to Breast Density Segmentation
and Mammographic Risk Scoring
Jinseob Kim
December 27, 2017
1 Introduction
2 Method
3 Result
4 Discussion
Introduction
The goal of this paper
Automatically learns features for images, which in our case are mammograms
CSAE: convolutional sparse autoencoder
Sparse autoencoder within a convolutional architecture
Learn abstract features from unlabelled data, then classify using the label information: two optimization processes
Large amounts of unlabelled data can be used.
Faster and more stable than training everything at once
2 tasks
1 Automated segmentation of percentage mammographic density (PMD)
2 Characterization of mammographic texture (MT) patterns, with the goal of predicting whether a woman will develop breast cancer
Structural information of breast tissue
Heterogeneity rather than density.
Manual vs. automated
Harder than MD scoring: the label of interest (healthy vs. diseased) is defined per image, not per pixel (e.g., fatty vs. dense)
Model summary
1 Multiscale denoising autoencoders
2 Convolutional architecture
3 Novel sparsity term to control the model capacity
Figure 1: Multiscale
Method
3 parts
1 Generating input data
Multiscale
2 Model representation
CNN
3 Parameter learning
Sparse autoencoder: novel sparsity regularizer
Problem
1 Using the entire image as input is impractical: computational burden
2 Downsampling is not an option: some features are visible only at a fine scale
Learn a compact representation for local neighborhoods (or patches) from the image
Figure 2: Patch
Patch creation
1 Image resize: 50 pixel/mm
2 Sample 48,000 patches from the whole dataset.
Restricted to 24 × 24 pixels
3 Density scoring
10% from the background and the pectoral muscle
45% from the fatty breast tissue
45% from the dense breast tissue.
4 Texture scoring
50% from the breast tissue of controls
50% from the breast tissue of cancer cases.
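As an illustration only (not the authors' code), sampling patch centers in these proportions could look like the sketch below; the pool names, pool sizes, and the sampling-with-replacement choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_patch_centers(pools, proportions, n_total=48000):
    """Draw patch centers from per-class pixel pools in the given proportions."""
    counts = (np.asarray(proportions) * n_total).astype(int)
    return {name: rng.choice(pool, size=n, replace=True)
            for (name, pool), n in zip(pools.items(), counts)}

# Hypothetical pools of candidate pixel indices per class (density scoring shown):
pools = {
    "background_pectoral": np.arange(100000),
    "fatty_tissue": np.arange(100000),
    "dense_tissue": np.arange(100000),
}
centers = sample_patch_centers(pools, proportions=[0.10, 0.45, 0.45])
# Texture scoring would instead use two pools (controls / cancer cases) with [0.50, 0.50].
```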
m = 24, M = 1, 4 layers: Conv + Maxpool + Conv + Conv
c = 1 (grayscale); 3 for color images
If t = 1, the adjacent m × m pixels are taken; if t = 4, every 8th pixel is taken from the larger, smoothed image.
K = {50, (50), 50, 100}, kernel size = {7, 2, 5, 5}
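A rough PyTorch sketch of the layer stack summarized above, using the K values and kernel sizes from the slide; the padding, strides, activations, and the three-class output are assumptions, and this is not the authors' implementation.

```python
import torch
import torch.nn as nn

# Sketch of the 4-layer stack: Conv(7x7, 50) -> MaxPool(2) -> Conv(5x5, 50) -> Conv(5x5, 100),
# followed by a softmax classifier; input is one 24 x 24 single-channel patch per scale.
class CSAESketch(nn.Module):
    def __init__(self, n_classes=3):  # e.g. background+pectoral / fatty / dense (assumption)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 50, kernel_size=7),    # 24x24 -> 18x18
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),        # 18x18 -> 9x9
            nn.Conv2d(50, 50, kernel_size=5),   # 9x9 -> 5x5
            nn.ReLU(),
            nn.Conv2d(50, 100, kernel_size=5),  # 5x5 -> 1x1
            nn.ReLU(),
        )
        self.classifier = nn.Linear(100, n_classes)  # softmax applied via the cross-entropy loss

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

logits = CSAESketch()(torch.randn(8, 1, 24, 24))  # a batch of eight 24 x 24 patches
```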
Multiscale input data
Capture long range interactions
Gaussian scale space
I(u; σ_t) = [I ∗ G_{σ_t}](u)
G_{σ_t}(x, y) = (1 / (2π σ_t)) · exp(−(x² + y²) / σ_t)
σ_t = Σ_{i=0}^{t−1} δ^{2i}
δ: downsampling factor = 2
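A minimal SciPy sketch of building such a Gaussian scale space; it reads σ_t as a variance (so the filter receives its square root) and uses four scales, both of which are assumptions beyond what the slide states.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_scale_space(image, n_scales=4, delta=2.0):
    """Blur the image at scales sigma_t = sum_{i=0}^{t-1} delta^(2i), t = 1..n_scales."""
    scales = []
    for t in range(1, n_scales + 1):
        sigma_t = sum(delta ** (2 * i) for i in range(t))   # read as a variance here
        scales.append(gaussian_filter(image, sigma=np.sqrt(sigma_t)))
    return scales

# At coarser scales the patch is sampled on a sparser grid (e.g. every 8th pixel at t = 4),
# so a fixed m x m patch covers a progressively larger neighborhood of the mammogram.
```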
3 steps from z^(l) to z^(l+1)
1 Extract sub-patches (called local receptive fields)
2 Feature learning: learn the transformation parameters (features) by autoencoding the local receptive fields
3 Feature encoding: transform all local receptive fields using the features learned in step 2.
Last layer: softmax classifier (multinomial logistic regression)
Sparse autoencoder
The whole architecture could be trained end-to-end, but...
Use unsupervised learning: autoencoder
Sparse
Enables learning a sparse overcomplete representation
Feature representation larger than the input
Standard autoencoder
https://wikidocs.net/3413
Tied weight autoencoder
Input (r): sub-patch (d × d, c channels)
Encoder: represent the input with K features (K < cd²)
a ≡ g(r) = φ(Wr + b)
φ(x) = max(0, x)
Decoder: reconstruct the input from the K features.
f(a) = ψ(Wᵀa + b̃)
Tied weights: the decoder reuses the (transposed) encoder weights
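A minimal NumPy sketch of this encoder/decoder pair, assuming tied weights mean the decoder uses the transposed encoder matrix and that ψ is the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
c, d, K = 1, 7, 50                                 # channels, sub-patch size, number of features
W = 0.01 * rng.standard_normal((K, c * d * d))     # encoder weights
b, b_tilde = np.zeros(K), np.zeros(c * d * d)      # encoder / decoder biases

def encode(r):
    """a = g(r) = phi(W r + b), with phi(x) = max(0, x)."""
    return np.maximum(0.0, W @ r + b)

def decode(a):
    """f(a) = psi(W^T a + b_tilde); psi taken as the identity (assumption)."""
    return W.T @ a + b_tilde

r = rng.standard_normal(c * d * d)   # one flattened d x d local receptive field
reconstruction = decode(encode(r))   # training minimizes the reconstruction error (+ sparsity term)
```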
Learned features as input of next layer
A K-dimensional feature vector per sub-patch
An (m − d + 1) × (m − d + 1) × K feature map per patch
This becomes the input of the next layer (see the sketch below)
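A small self-contained sketch of that dimensionality: encoding every d × d local receptive field of an m × m patch produces the (m − d + 1) × (m − d + 1) × K map fed to the next layer. The weights here are random placeholders standing in for the learned encoder.

```python
import numpy as np

rng = np.random.default_rng(0)
m, d, K = 24, 7, 50                           # patch size, sub-patch size, features per sub-patch
W = 0.01 * rng.standard_normal((K, d * d))    # stands in for the learned encoder weights

patch = rng.standard_normal((m, m))           # one single-channel (c = 1) patch
out = np.zeros((m - d + 1, m - d + 1, K))     # next layer's input: 18 x 18 x 50 here
for i in range(m - d + 1):
    for j in range(m - d + 1):
        r = patch[i:i + d, j:j + d].ravel()   # d x d local receptive field
        out[i, j] = np.maximum(0.0, W @ r)    # its K-dimensional feature vector
```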
Estimation
Sparse overcomplete representations
Number of basis vectors > dimension of the input
K > cd²
In this study, K = {50, (50), 50, 100}, kernel size = {7, 2, 5, 5}
http://mlsp.cs.cmu.edu/courses/fall2013/lectures/slides/class15.sparseovercomplete.pdf
Types of sparsity
1 Population sparsity: each input is described by a small number of features
Compact encoding per example
2 Lifetime sparsity: a given feature is not used to describe many different inputs.
Example-specific features
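A toy NumPy sketch of the distinction, on a hypothetical activation matrix with rows as examples and columns as features; the activation threshold is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.maximum(0.0, rng.standard_normal((1000, 50)) - 1.0)   # activations: 1000 examples x 50 features
active = A > 0

per_example = active.mean(axis=1)   # population sparsity: fraction of features active for each example
per_feature = active.mean(axis=0)   # lifetime sparsity: fraction of examples on which each feature fires

# A good sparse code keeps both distributions concentrated at small values:
# each example uses few features, and each feature specializes to few examples.
print(per_example.mean(), per_feature.mean())
```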
Estimation: Novel sparsity term
Experiments and Datasets
2 different tasks (MD, MT)
First segmented mammograms into background, pectoral muscle, and breast tissue region
3 different mammographic datasets
Density Dataset (493 mammograms of healthy women): scored by a radiologist
Texture Dataset (MMHS): scored by a trained observer
Dutch Breast Cancer Screening Dataset: scored by software (Volpara)
Result
MD: Density datasets
Initial output: the probability that a given pixel (patch) belongs to the dense-tissue class
Dense-tissue patches were over-represented for faster training, so overestimation is possible
The threshold for calling a pixel dense was therefore raised from 0.5 to 0.75 (see the sketch below)
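A minimal sketch of this thresholding step, assuming a per-pixel dense-tissue probability map and a breast mask are available:

```python
import numpy as np

def percent_mammographic_density(p_dense, breast_mask, threshold=0.75):
    """PMD = dense pixels / breast-tissue pixels, using the raised threshold from the slide."""
    dense = (p_dense > threshold) & breast_mask
    return 100.0 * dense.sum() / breast_mask.sum()
```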
Image-wise average of the Dice coefficients
Dice = 2 · |A ∩ B| / (|A| + |B|)
A: automated segmentation, B: radiologist
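For reference, a one-function sketch of this Dice coefficient for two boolean masks:

```python
import numpy as np

def dice(A, B):
    """Dice = 2|A ∩ B| / (|A| + |B|) for two boolean segmentation masks."""
    A, B = np.asarray(A, dtype=bool), np.asarray(B, dtype=bool)
    return 2.0 * np.logical_and(A, B).sum() / (A.sum() + B.sum())
```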
MD: Dutch Breast Cancer Screening Dataset
The model trained on the Density Dataset is applied to this dataset
MT: Texture Dataset
Initial output: the probability that a given pixel (patch) belongs to the cancer class.
MT score per image: 500 patches are sampled at random from the breast area, scored, and the scores are averaged (see the sketch below).
The random sampling was verified to change the AUC by less than 0.01.
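A hedged sketch of this per-image scoring, assuming a patch_cancer_probability function (the trained model's output for a patch centered at a given pixel) and a breast mask; the sampling details are illustrative only.

```python
import numpy as np

def mt_score(image, breast_mask, patch_cancer_probability, n_patches=500, seed=0):
    """Average the cancer-class probability over patches sampled inside the breast area."""
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(breast_mask)                         # candidate patch centers
    idx = rng.choice(len(ys), size=n_patches, replace=True)
    return float(np.mean([patch_cancer_probability(image, ys[i], xs[i]) for i in idx]))
```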
MT: Dutch Breast Cancer Screening Dataset
Discussion
Summary
Present an unsupervised feature learning method for breast density
segmentation and automatic texture scoring.
Can learn useful features
After adapting a small set of hyperparameters (feature scales, output
size, and label classes), the CSAE model achieved state-of-the-art
results on each of the tasks
CSAE vs Classical CNN
The usage of unsupervised pre-training
Improved performance
Limitation: MT scoring
Every location in a mammogram receives the same label: case vs. control
Assumed that texture changes are systemic and occur at many
locations in the tissue
Conclusion: future ideas
Combine information from several parts (patches) of an image and map them to a single label.
Requires more data and computing power
Easily adjusted to support 3D data