Making deep neural networks robust to label noise: a loss correction approach

Giorgio Patrini, Alessandro Rozza, Aditya Krishna Menon, Richard Nock, Lizhen Qu

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

409 Citations (Scopus)


We present a theoretically grounded approach to train deep neural networks, including recurrent networks, subject to class-dependent label noise. We propose two procedures for loss correction that are agnostic to both application domain and network architecture. They simply amount to at most a matrix inversion and multiplication, provided that we know the probability of each class being corrupted into another. We further show how one can estimate these probabilities, adapting a recent technique for noise estimation to the multi-class setting, and thus providing an end-to-end framework. Extensive experiments on MNIST, IMDB, CIFAR-10, CIFAR-100 and a large-scale dataset of clothing images employing a diversity of architectures - stacking dense, convolutional, pooling, dropout, batch normalization, word embedding, LSTM and residual layers - demonstrate the noise robustness of our proposals. Incidentally, we also prove that, when ReLU is the only non-linearity, the loss curvature is immune to class-dependent label noise.
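The "matrix inversion and multiplication" the abstract refers to can be sketched in a few lines. Below is a minimal NumPy illustration of the two corrections, assuming a known row-stochastic noise matrix T where T[i, j] is the probability that clean class i is flipped to noisy class j; the function names are my own, not the paper's:

```python
import numpy as np

def backward_corrected_loss(probs, noisy_label, T):
    """Backward correction: multiply the vector of per-class losses by T^{-1}.

    `probs` are the model's predicted class probabilities for one example.
    The corrected loss is an unbiased estimator of the loss on clean labels.
    """
    losses = -np.log(probs)            # cross-entropy under each possible clean label
    corrected = np.linalg.inv(T) @ losses
    return corrected[noisy_label]

def forward_corrected_loss(probs, noisy_label, T):
    """Forward correction: push predictions through T before taking the loss,
    so the model's output is compared against the noisy label distribution."""
    noisy_probs = T.T @ probs          # predicted distribution over *noisy* labels
    return -np.log(noisy_probs[noisy_label])
```

With T equal to the identity (no noise), both reduce to ordinary cross-entropy; in practice T is estimated from the noisy data, as the paper's end-to-end framework describes.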

Original language: English
Title of host publication: Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017
Editors: Jim Rehg, Yanxi Liu, Ying Wu, Camillo Taylor
Place of publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 9
ISBN (Electronic): 9781538604571
ISBN (Print): 9781538604588
Publication status: Published - 2017
Externally published: Yes
Event: IEEE Conference on Computer Vision and Pattern Recognition 2017 - Honolulu, United States of America
Duration: 21 Jul 2017 to 26 Jul 2017 (Proceedings)


Conference: IEEE Conference on Computer Vision and Pattern Recognition 2017
Abbreviated title: CVPR 2017
Country/Territory: United States of America
Internet address:
