AI in EE

AI IN DIVISIONS

AI in Computer Division


Ki-Min Lee (KAIST EE), Suk-Min Yun (KAIST EE), Ki-Bok Lee (CS UMich), Hong-Lak Lee (CS UMich / Google Brain), Bo Li (CS UIUC) and Jin-Woo Shin (KAIST EE) have had a paper accepted at the 36th International Conference on Machine Learning (ICML 2019).

Title: Robust Inference via Generative Classifiers for Handling Noisy Labels (ICML 2019)

Authors: Ki-Min Lee (KAIST EE), Suk-Min Yun (KAIST EE), Ki-Bok Lee (CS UMich), Hong-Lak Lee (CS UMich / Google Brain), Bo Li (CS UIUC) and Jin-Woo Shin (KAIST EE)

Large-scale datasets may contain significant proportions of noisy (incorrect) class labels, and it is well known that modern deep neural networks (DNNs) generalize poorly from such noisy training datasets. To mitigate this issue, we propose a novel inference method, termed Robust Generative classifier (RoG), applicable to any discriminative (e.g., softmax) neural classifier pre-trained on noisy datasets. In particular, we induce a generative classifier on top of the hidden feature spaces of the pre-trained DNN to obtain a more robust decision boundary. By estimating the parameters of the generative classifier with the minimum covariance determinant estimator, we significantly improve classification accuracy without re-training the deep model or changing its architecture. Under the assumption that the features follow a Gaussian distribution, we prove that RoG generalizes better than baselines under noisy labels. Finally, we propose an ensemble version of RoG that improves performance by exploiting the layer-wise characteristics of DNNs. Our extensive experimental results demonstrate the superiority of RoG across different learning models optimized by several training techniques that handle diverse scenarios of noisy labels.
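To make the idea concrete, below is a minimal sketch (not the authors' released code) of how a robust generative classifier could be induced on features extracted from a pre-trained network, using scikit-learn's MinCovDet as the minimum covariance determinant estimator. The array names `features` and `noisy_labels` are placeholders for the penultimate-layer activations and the (possibly noisy) training labels.

```python
import numpy as np
from sklearn.covariance import MinCovDet

def fit_robust_generative_classifier(features, noisy_labels):
    """Fit class-conditional Gaussians with MCD (robust) mean/covariance estimates.

    features:      (N, D) hidden features from a pre-trained DNN (placeholder)
    noisy_labels:  (N,)   possibly noisy class labels (placeholder)
    """
    classes = np.unique(noisy_labels)
    params = {}
    for c in classes:
        x_c = features[noisy_labels == c]
        mcd = MinCovDet().fit(x_c)  # robust estimates of class mean and covariance
        params[c] = (mcd.location_, mcd.covariance_, len(x_c) / len(features))
    return params

def predict(features, params):
    """Assign each sample to the class with the highest Gaussian log-posterior."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mean, cov, prior = params[c]
        inv = np.linalg.pinv(cov)
        diff = features - mean
        _, logdet = np.linalg.slogdet(cov)
        # log N(x | mean, cov) + log prior, up to a constant shared by all classes
        log_lik = -0.5 * (np.einsum("ij,jk,ik->i", diff, inv, diff) + logdet)
        scores.append(log_lik + np.log(prior))
    return np.array(classes)[np.argmax(np.stack(scores, axis=1), axis=1)]
```

Because the classifier is fitted only on extracted features, it can be applied to any pre-trained discriminative model without re-training or architectural changes, in the spirit of the paper.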

Figure 1. t-SNE visualization of penultimate-layer features of training samples when the noise fraction is 20%.