Confident Kernel Sparse Coding
and Dictionary Learning
Babak Hosseini
Prof. Dr. Barbara Hammer
Singapore, 20 Nov. 2018
bhosseini@techfak.uni-bielefeld.de
Cognitive Interaction Technology Centre of Excellence (CITEC)
Bielefeld University, Germany
Outline
• Introduction
• Confident Dictionary Learning
• Experiments
• Conclusion
2
Introduction
• Introduction
• Confident Dictionary Learning
• Experiments
• Conclusion
3
Introduction
• Dictionary learning and sparse coding
• X: Input signals
• U: Dictionary matrix
• Γ: Sparse codes
• Reconstructing X using sparse resources from U
4
Introduction
• Dictionary learning and sparse coding
• X: Input signals
• U: Dictionary matrix
• Γ: Sparse codes
• Reconstructing X using sparse resources from U
5
x ≈ U γ
Introduction
• Dictionary learning and sparse coding
• X: Input signals
• U: Dictionary matrix
• Γ: Sparse codes
• Estimating Γ → sparse coding
• Estimating U → dictionary learning
6
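The two estimation steps above alternate in practice. The following is a minimal sketch (not the paper's algorithm): sparse coding via ISTA with an l1 penalty, and a least-squares dictionary update with unit-norm atoms; the variable names and the penalty weight `lam` are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Element-wise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code(X, U, lam=0.1, n_iter=50):
    """Estimate sparse codes Gamma for a fixed dictionary U (ISTA iterations)."""
    L = np.linalg.norm(U, 2) ** 2           # Lipschitz constant of the gradient
    G = np.zeros((U.shape[1], X.shape[1]))
    for _ in range(n_iter):
        G = soft_threshold(G - (U.T @ (U @ G - X)) / L, lam / L)
    return G

def dict_update(X, G):
    """Least-squares dictionary update, then rescale atoms to unit norm."""
    U = X @ np.linalg.pinv(G)
    return U / np.maximum(np.linalg.norm(U, axis=0, keepdims=True), 1e-12)

# Alternate the two estimation steps on toy data
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 100))          # columns are input signals
U = rng.standard_normal((20, 30))           # 30 dictionary atoms
for _ in range(10):
    Gamma = sparse_code(X, U)
    U = dict_update(X, Gamma)
```

Each outer iteration fixes one unknown (Γ or U) and updates the other, which is the standard way the joint non-convex problem is handled.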
Introduction
• Kernel Dictionary learning/sparse coding
• Non-vectorial data
• Time-series
• Representation: Kernel matrix
• Pairwise similarity/distance of (x1, x2)
7
Introduction
• Kernel Dictionary learning/sparse coding
• Implicit feature mapping Φ: x → Φ(x)
• Dictionary: U = Φ(X)A
• Atoms are linear combinations of the training data in feature space
8
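With the standard U = Φ(X)A parameterization, the feature-space reconstruction error needs only kernel evaluations. A small sketch of this expansion (the matrix names are illustrative), checked against a linear kernel where Φ is the identity:

```python
import numpy as np

def kernel_recon_error(K, kx, kxx, A, gamma):
    """Squared feature-space error ||phi(x) - Phi(X) A gamma||^2, expanded as
    k(x,x) - 2 gamma^T A^T k_X(x) + gamma^T A^T K A gamma,
    so it uses only the kernel matrix K, the vector kx = k(X, x), and kxx = k(x, x)."""
    return kxx - 2.0 * gamma @ (A.T @ kx) + gamma @ (A.T @ K @ A @ gamma)

# Sanity check with a linear kernel, where phi(x) = x
rng = np.random.default_rng(1)
X = rng.standard_normal((6, 10))            # columns are training samples
x = rng.standard_normal(6)                  # a query signal
A = rng.standard_normal((10, 4))            # dictionary coefficients: U = Phi(X) A
gamma = rng.standard_normal(4)

err_kernel = kernel_recon_error(X.T @ X, X.T @ x, x @ x, A, gamma)
err_explicit = np.linalg.norm(x - X @ A @ gamma) ** 2
```

The two errors coincide for the linear kernel, which is why no explicit mapping Φ is ever needed.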
Introduction
• Discriminative dictionary learning (DDL)
• Classification setting
• Label matrix
• Goal:
• Learn a Dictionary U that reconstructs X via Γ
• Mapping Γ: X → L
9
Introduction
• Discriminative dictionary learning (DDL)
• Goal:
• Learn a Dictionary U that reconstructs X via Γ
• Mapping Γ: X → L
• The discriminative term ensures the discriminative mapping for the training data.
• It has access to the label information L.
10
Introduction
• What is the problem?
11
Introduction
• What is the problem?
• The test and train models are not consistent!
12
Introduction
• What is the problem?
• The test and train models are not consistent!
• Reconstruction of test data doesn’t follow the discriminative mapping.
13
Outline
• Introduction
• Confident Dictionary Learning
• Experiments
• Conclusion
14
Confident Dictionary Learning
• A new discriminant objective
• What is the point of that?!
15
Confident Dictionary Learning
• A new discriminant objective
• Reconstruction model:
• A class-contribution vector: its entries show the share of each class in the reconstruction of x.
16
Confident Dictionary Learning
• A new discriminant objective
• Reconstruction model:
Therefore,
• Mapping to label space:
17
Confident Dictionary Learning
• A new discriminant objective
• Minimizing this term → each x is reconstructed mostly by its own class.
• Flexible term: x can still use other classes (minor share).
18
Confident Dictionary Learning
• Consistency?
19
Confident Dictionary Learning
• Classification of test data:
• z: test data
• Label of z: the class j with the highest contribution in the reconstruction
20
• Classification of test data:
• z: test data
• Class j: highest contribution in the reconstruction
• Minimizing this term forces z to be reconstructed using fewer classes.
• Flexible: it can still use a small share of other classes (if required).
• Confident toward one class.
Confident Dictionary Learning
21
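The recall rule above can be sketched as follows. As a simplifying assumption (the paper's exact criterion may differ), the contribution of class j is measured by the feature-space norm of the partial reconstruction built from class-j atoms alone, computed via the kernel matrix:

```python
import numpy as np

def classify_by_contribution(K, A, gamma, atom_labels, n_classes):
    """Assign z to the class whose atoms contribute most to its reconstruction.
    K: kernel matrix of the training data, A: dictionary coefficients (U = Phi(X) A),
    gamma: sparse code of z, atom_labels: class of each atom (illustrative setup)."""
    M = A.T @ K @ A                          # Gram matrix of the atoms in feature space
    contrib = np.zeros(n_classes)
    for j in range(n_classes):
        gj = np.where(atom_labels == j, gamma, 0.0)   # keep class-j coefficients only
        contrib[j] = gj @ M @ gj             # ||partial reconstruction||^2 via kernels
    return int(np.argmax(contrib))

# Toy usage: codes concentrated on class-0 atoms yield label 0
K = np.eye(4)
A = np.eye(4)
gamma = np.array([1.0, 0.5, 0.1, 0.0])
atom_labels = np.array([0, 0, 1, 1])
label = classify_by_contribution(K, A, gamma, atom_labels, 2)
```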
Confident Dictionary Learning
• Convexity?
• Not convex!
22
Confident Dictionary Learning
• Convexity?
• Not convex!
• β = −(most negative eigenvalue of V)
• Computed once, before training!
23
Confident Dictionary Learning
• Consistent?
• The recall phase uses L too.
• Its objective term is similar to the training one.
• Flexible contributions
• Confidence criteria
• More consistent
24
Outline
• Introduction
• Confident Dictionary Learning
• Experiments
• Conclusion
25
Experiments
• Datasets:
• Multi-dimensional time-series
• Cricket Umpire [1]
• Articulatory Words [2]
• Schunk Dexterous [3, 4]
• UTKinect Actions [5]
• DynTex++ [6]
[1] M. H. Ko, G. W. West, S. Venkatesh, and M. Kumar, “Online context recognition in multisensor systems using dynamic time warping,” in ISSNIP’05. IEEE, 2005, pp. 283–288.
[2] J. Wang, A. Samal, and J. Green, “Preliminary test of a real-time, interactive silent speech interface based on electromagnetic
articulograph,” in SLPAT’14, 2014, pp. 38–45.
[3] A. Drimus, G. Kootstra, A. Bilberg, and D. Kragic, “Design of a flexible tactile sensor for classification of rigid and deformable
objects,” Robotics and Autonomous Systems, vol. 62, no. 1, pp. 3–15, 2014.
[4] M. Madry, L. Bo, D. Kragic, and D. Fox, “ST-HMP: Unsupervised spatiotemporal feature learning for tactile data,” in ICRA’14. IEEE, 2014, pp. 2262–2269.
[5] L. Xia, C.-C. Chen, and J. Aggarwal, “View invariant human action recognition using histograms of 3D joints,” in CVPRW’12 Workshops. IEEE, 2012, pp. 20–27.
[6] B. Ghanem and N. Ahuja, “Maximum margin distance learning for dynamic texture recognition,” in ECCV’10. Springer, 2010,
26
Experiments
• Baselines:
• K-KSVD: Kernel K-SVD
• LC-NNKSC: Predecessor of CKSC
• EKDL: Equiangular kernel dictionary learning
• KGDL: Dictionary learning on Grassmann manifolds
• LP-KSVD: Locality preserving K-SVD
27
Experiments
• Baselines:
• K-KSVD: Kernel K-SVD
• LC-NNKSC: Predecessor of CKSC
• EKDL: Equiangular kernel dictionary learning
• KGDL: Dictionary learning on Grassmann manifolds
• LP-KSVD: Locality preserving K-SVD
• Classification Accuracy (%):
• “Dictionary discrimination power”
28
Experiments
• Interpretability of atoms (IP):
• Dictionary: U = Φ(X)A
• IP score of atom i, in the range [1/c, 1]:
• 1 if atom i belongs to only one class
• 1/c if atom i is related to all classes
29
Experiments
• Interpretability of atoms (IP):
• IP score in the range [1/c, 1]:
• 1 if atom i belongs to only one class
• 1/c if atom i is related to all classes
• Good discrimination
• Good interpretation
30
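A score with exactly this range can be sketched as follows. As an assumption (the paper's exact definition may differ), each atom's class profile is the total absolute weight it places on each class's training samples, and IP is the largest normalized share: 1 for a single-class atom, 1/c for a perfectly mixed one.

```python
import numpy as np

def interpretability(A, labels, c):
    """Per-atom IP score in [1/c, 1].
    A: (n_samples x n_atoms) coefficient matrix of the atoms over training samples,
    labels: class of each training sample, c: number of classes (illustrative setup)."""
    profile = np.zeros((c, A.shape[1]))
    for j in range(c):
        profile[j] = np.abs(A[labels == j]).sum(axis=0)   # class-j weight per atom
    profile /= np.maximum(profile.sum(axis=0, keepdims=True), 1e-12)
    return profile.max(axis=0)               # largest class share of each atom

# Atom 0 uses only class-0 samples (IP = 1); atom 1 mixes both classes equally (IP = 1/2)
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
labels = np.array([0, 0, 1, 1])
ip = interpretability(A, labels, 2)
```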
Conclusion
• Consistency is important for DDL models.
• Proposed a flexible discriminant term for DDL.
• Proposed a more consistent training-recall framework.
• Increase in:
• Discriminative performance
• Interpretability of dictionary atoms.
31
Thank you very much!
32