Abstract
We present a theoretically grounded approach to train deep neural networks, including recurrent networks, subject to class-dependent label noise. We propose two procedures for loss correction that are agnostic to both application domain and network architecture. They simply amount to at most a matrix inversion and multiplication, provided that we know the probability of each class being corrupted into another. We further show how one can estimate these probabilities, adapting a recent technique for noise estimation to the multi-class setting, and thus providing an end-to-end framework. Extensive experiments on MNIST, IMDB, CIFAR-10, CIFAR-100 and a large-scale dataset of clothing images, employing a diversity of architectures (stacking dense, convolutional, pooling, dropout, batch normalization, word embedding, LSTM and residual layers), demonstrate the noise robustness of our proposals. Incidentally, we also prove that, when ReLU is the only non-linearity, the loss curvature is immune to class-dependent label noise.
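The abstract's "matrix inversion and multiplication" admits a compact illustration. Below is a minimal NumPy sketch of the two corrections, assuming a known noise-transition matrix `T` where `T[i, j]` is the probability that clean class `i` is observed as class `j`; the function names and the plain cross-entropy loss are illustrative choices, not the paper's code.

```python
import numpy as np

def cross_entropy_per_class(probs):
    """Vector of cross-entropy losses, one entry per candidate label."""
    return -np.log(probs)

def backward_corrected_loss(probs, noisy_label, T):
    """Loss correction via matrix inversion: premultiply the per-class loss
    vector by T^{-1}, then index with the observed (noisy) label.
    Assumes T is invertible."""
    losses = cross_entropy_per_class(probs)   # shape (num_classes,)
    corrected = np.linalg.inv(T) @ losses     # T^{-1} * loss vector
    return corrected[noisy_label]

def forward_corrected_loss(probs, noisy_label, T):
    """Loss correction via multiplication: evaluate the loss on the
    T^T-transformed predicted distribution."""
    noisy_probs = T.T @ probs
    return -np.log(noisy_probs[noisy_label])
```

With `T` the identity (no noise), both reduce to ordinary cross-entropy; for a noisy `T`, the backward-corrected losses satisfy `T @ corrected == losses`, i.e. averaging the corrected loss over the noise process recovers the clean loss.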
Original language | English |
---|---|
Title of host publication | Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017 |
Editors | Jim Rehg, Yanxi Liu, Ying Wu, Camillo Taylor |
Place of Publication | Piscataway NJ USA |
Publisher | IEEE, Institute of Electrical and Electronics Engineers |
Pages | 2233-2241 |
Number of pages | 9 |
ISBN (Electronic) | 9781538604571 |
ISBN (Print) | 9781538604588 |
DOIs | |
Publication status | Published - 2017 |
Externally published | Yes |
Event | IEEE Conference on Computer Vision and Pattern Recognition 2017 - Honolulu, United States of America |
Event duration | 21 Jul 2017 → 26 Jul 2017 |
Event website | http://cvpr2017.thecvf.com/ |
Proceedings | https://ieeexplore.ieee.org/xpl/conhome/8097368/proceeding |
Conference
Conference | IEEE Conference on Computer Vision and Pattern Recognition 2017 |
---|---|
Abbreviated title | CVPR 2017 |
Country/Territory | United States of America |
City | Honolulu |
Period | 21/07/17 → 26/07/17 |
Internet address | http://cvpr2017.thecvf.com/ |