The Majority Can Help the Minority:
Context-rich Minority Oversampling for Long-tailed Classification
Seulki Park1, Youngkyu Hong2, Byeongho Heo2, Sangdoo Yun2, Jin Young Choi1
1Seoul National University, 2NAVER AI Lab
Poster ID: 159a, Poster Time: Jun. 22, 10:00-12:30
Long-tailed Classification
Many real-world datasets exhibit a long-tailed distribution.
✓ A model trained on such imbalanced data tends to overfit the majority classes.
✓ That is, the model performs poorly on the minority classes.
Problem Definition:
● Input: long-tailed (imbalanced) training data & uniformly distributed (balanced) test data.
● Goal: to train a robust model that generalizes well on the balanced test data.
[Figure: example long-tailed domains. Faces (Zhang et al., 2017), Places (Wang et al., 2017), Species (Van Horn et al., 2018), Actions (Zhang et al., 2019). * Images by authors.]
Previous Oversampling Methods
1. Random Oversampling (ROS)
◦ A simple, straightforward method that repeatedly duplicates minority-class samples.
◦ However, this may intensify the overfitting problem [1].
2. Synthetic Minority Over-sampling Technique (SMOTE), 2002
◦ Oversamples the minority classes by interpolating between existing minority samples and their nearest minority neighbors (see the sketch below).
◦ However, it is hard to use in end-to-end algorithms and on large-scale image datasets due to the high computational cost of
computing k-nearest neighbors for every sample.
[1] Deep imbalanced attribute classification using visual attention aggregation, ECCV, 2018.
[SMOTE illustration, figure from: http://www.incodom.kr/SMOTE]
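For intuition, here is a minimal NumPy sketch of the SMOTE interpolation step. The function and variable names are ours, not from the original SMOTE code, and real implementations (e.g., imbalanced-learn) also handle the k-nearest-neighbor search and per-class balancing:

```python
import numpy as np

def smote_sample(x_i, neighbors, rng=np.random.default_rng(0)):
    """Synthesize one minority sample by interpolating between a minority
    sample x_i and one of its k nearest minority-class neighbors."""
    x_nn = neighbors[rng.integers(len(neighbors))]  # pick a random neighbor
    lam = rng.random()                              # interpolation factor in [0, 1)
    return x_i + lam * (x_nn - x_i)                 # a point on the segment x_i -> x_nn
```

Because every synthetic sample lies on a segment between two existing minority samples, SMOTE can only produce context-limited images, which motivates the limitation discussed below.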
Previous Oversampling Methods
3. Generative Adversarial Minority Oversampling (GAMO), 2019
◦ Produces new minority samples by training a convex generator, inspired by the success of
generative adversarial networks (GANs).
◦ However, the generator is difficult to train (mode collapse) and adds training cost.
4. MetaSAug: Meta Semantic Augmentation for Long-Tailed Visual Recognition, 2021
◦ Uses the implicit semantic data augmentation (ISDA) algorithm [1].
◦ However, this meta-learning-based method requires an additional balanced validation set and a large number of extra
training iterations (high training cost).
[1] Implicit semantic data augmentation for deep networks, NeurIPS, 2019.
Limitations of previous methods:
1) Simple methods generate only context-limited images (e.g., ROS, SMOTE).
- Limited improvement, especially when the imbalance is severe.
2) Recent methods incur expensive additional training costs (e.g., GAMO, MetaSAug).
- E.g., training a generator, an additional balanced validation set, longer training epochs.
→ We need a simple and context-rich oversampling method!
Motivation
Q. How can we generate diverse, context-rich minority samples from a long-tailed distribution?
A. Let's pay attention to the characteristics of long-tailed distributions.
Key Observation:
✓ Majority-class samples are data-rich and information-rich!
→ Let's use the abundant information of the majority samples
to generate new minority samples.
Key Idea
- We can use the rich majority-class images as backgrounds
for newly created minority-class images.
Proposed Method: Context-rich Minority Oversampling (CMO)
Recap: CutMix (Yun et al., 2019)
- A simple but effective data augmentation method used in many visual tasks.
$\tilde{x} = \mathbf{M} \odot x_b + (\mathbf{1} - \mathbf{M}) \odot x_f, \qquad \tilde{y} = \lambda y_b + (1 - \lambda) y_f$
where $(x_b, y_b), (x_f, y_f) \sim P$ and $\mathbf{M} \in \{0, 1\}^{W \times H}$ is a binary mask.
→ CutMix is designed for class-balanced datasets: naively applying it to a long-tailed dataset generates even more samples of the majority classes.
Context-rich Minority Oversampling (CMO)
- For an imbalanced dataset, we use different distributions for background and foreground images.
$(x_b, y_b) \sim P, \qquad (x_f, y_f) \sim Q$
where $Q$ is a minor-class-weighted distribution.
[Figure: comparison between CutMix and CMO]
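To make the mixing step concrete, here is a minimal PyTorch-style sketch of constructing one CMO mixed batch, assuming a background batch (x_bg, y_bg) drawn from P and a foreground batch (x_fg, y_fg) drawn from Q. The box construction follows standard CutMix; the helper names (rand_bbox, cmo_mix) are ours:

```python
import numpy as np

def rand_bbox(W, H, lam):
    """Random box covering roughly a (1 - lam) fraction of the image (CutMix-style)."""
    cut_ratio = np.sqrt(1.0 - lam)
    cut_w, cut_h = int(W * cut_ratio), int(H * cut_ratio)
    cx, cy = np.random.randint(W), np.random.randint(H)
    x1, x2 = np.clip(cx - cut_w // 2, 0, W), np.clip(cx + cut_w // 2, 0, W)
    y1, y2 = np.clip(cy - cut_h // 2, 0, H), np.clip(cy + cut_h // 2, 0, H)
    return x1, y1, x2, y2

def cmo_mix(x_bg, y_bg, x_fg, y_fg, alpha=1.0):
    """Paste a patch from the foreground (Q) batch onto the background (P) batch."""
    lam = np.random.beta(alpha, alpha)           # background area ratio
    W, H = x_bg.size(3), x_bg.size(2)            # NCHW torch tensors
    x1, y1, x2, y2 = rand_bbox(W, H, lam)
    x_mix = x_bg.clone()
    x_mix[:, :, y1:y2, x1:x2] = x_fg[:, :, y1:y2, x1:x2]
    lam = 1.0 - (x2 - x1) * (y2 - y1) / (W * H)  # exact area ratio after clipping
    return x_mix, y_bg, y_fg, lam                # train with a lam-weighted two-label loss
```

The only difference from vanilla CutMix is where the foreground batch comes from: since Q oversamples the tail, minority-class patches get pasted onto majority-class backgrounds, yielding context-rich minority samples.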
Proposed Method: Minor-class-weighted distribution
How do we design a minor-class-weighted sampling strategy?
- Re-weighting methods provide a way to assign appropriate weights to samples.
- Commonly used sampling strategies give each class a weight inversely proportional to its frequency [1, 2]
or to its effective number [3].
- $n_k$: the number of samples in the $k$-th class; $C$: the total number of classes.
- The generalized sampling probability for the $k$-th class can be defined as
$q(r, k) = \dfrac{(1/n_k)^r}{\sum_{k'=1}^{C} (1/n_{k'})^r}$
- As $r$ increases, the weight of the minority classes becomes increasingly larger than
that of the majority classes ($r = 0$ gives uniform class weights, $r = 1$ gives inverse class frequency).
- The effective number [3] of the $k$-th class is defined as
$E_k = \dfrac{1 - \beta^{n_k}}{1 - \beta}$
and sampling weights can be set inversely proportional to $E_k$.
[1] Learning deep representation for imbalanced classification, CVPR, 2016.
[2] Exploring the limits of weakly supervised pretraining, ECCV, 2018.
[3] Class-balanced loss based on effective number of samples, CVPR, 2019.
[Figure: original data distribution and the resulting sampling distributions $q(r, \cdot)$ for $r = 0$, $r = 1$, and $r = 2$]
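A small NumPy sketch of the two weighting schemes above; the class counts n are a hypothetical long-tailed example, not from the paper:

```python
import numpy as np

n = np.array([5000, 500, 50, 5])  # hypothetical class counts n_k, C = 4

def q(r, n):
    """Minor-class-weighted sampling probability q(r, k) for every class k."""
    w = (1.0 / n) ** r
    return w / w.sum()

def effective_number_weights(n, beta=0.999):
    """Weights inversely proportional to the effective number E_k [3]."""
    E = (1.0 - beta ** n) / (1.0 - beta)
    w = 1.0 / E
    return w / w.sum()

print(q(0, n))  # r = 0: uniform over classes
print(q(1, n))  # r = 1: inverse class frequency
print(q(2, n))  # r = 2: even heavier minority weighting
```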
Proposed Method: Algorithm
[Algorithm figure: the CMO training procedure, see the sketch below]
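The algorithm figure does not survive the export. In outline, each training iteration draws a background batch from P (the usual random loader) and a foreground batch from the minor-class-weighted distribution Q, mixes them with the CutMix-style operation above, and optimizes the mixed loss. A hedged sketch under those assumptions, reusing cmo_mix and q from the previous slides; model, optimizer, train_set, and the class counts n are assumed to exist, and the exact procedure is in the paper and the official code linked at the end:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, WeightedRandomSampler

# Per-sample weights that realize Q: each sample inherits its class weight q(r, k).
targets = np.asarray(train_set.targets)
sample_w = q(1.0, n)[targets]

loader_P = DataLoader(train_set, batch_size=128, shuffle=True)
loader_Q = DataLoader(train_set, batch_size=128,
                      sampler=WeightedRandomSampler(sample_w, num_samples=len(train_set)))

criterion = torch.nn.CrossEntropyLoss()
for (x_bg, y_bg), (x_fg, y_fg) in zip(loader_P, loader_Q):
    x_mix, y_b, y_f, lam = cmo_mix(x_bg, y_bg, x_fg, y_fg)  # paste minority patch
    out = model(x_mix)
    loss = lam * criterion(out, y_b) + (1.0 - lam) * criterion(out, y_f)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```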
Experimental Results
1. Datasets
◦Synthetic data:
▪CIFAR-100-LT (100 classes), ImageNet-LT (1,000 classes)
◦Real-world data:
▪iNaturalist 2018 (8,142 classes)
※ Imbalance ratio: the ratio of the size of the most frequent class to that of the least frequent class.
2. Evaluation metrics
◦Top-1 accuracy
◦Accuracy on three disjoint class splits (Many: >100 training images, Medium: 20-100, Few: <20) [1]
[1] Large-scale long-tailed recognition in an open world, CVPR, 2019.
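A minimal sketch of the split-wise evaluation, assuming NumPy arrays of test predictions, test labels, and per-class training counts (names are ours):

```python
import numpy as np

def split_accuracy(preds, labels, train_counts):
    """Top-1 accuracy over the Many (>100), Medium (20-100), and Few (<20) splits,
    where a class's split is decided by its number of training images."""
    counts = train_counts[labels]  # training frequency of each test sample's class
    correct = preds == labels
    splits = {"Many": counts > 100,
              "Med": (counts >= 20) & (counts <= 100),
              "Few": counts < 20}
    return {name: float(correct[mask].mean()) for name, mask in splits.items()}
```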
Long-tailed classification benchmarks (ImageNet-LT)
1. Comparison with the state of the art
2. Comparison with oversampling methods
3. Results of longer training epochs
Analysis
1. Impact of different Q distributions
2. Using different augmentation methods
3. Variants of CMO
4. Generated Images
Conclusion
✓We propose a novel context-rich minority oversampling method (CMO) that leverages the rich context of the majority
classes as background images.
✓It requires little additional computational cost and can be easily integrated into existing methods.
✓It is simple yet effective, achieving state-of-the-art performance.
✓We empirically demonstrate the effectiveness of the proposed oversampling method through extensive
experiments and ablation studies.
Thank you!
Contact: seulki.park@snu.ac.kr
Code: https://github.com/naver-ai/cmo