Training quantized neural networks with a full-precision auxiliary module

Bohan Zhuang, Lingqiao Liu, Mingkui Tan, Chunhua Shen, Ian D. Reid

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


In this paper, we tackle a key challenge in training low-precision networks: the difficulty of propagating gradients through a low-precision network due to the non-differentiable quantization function. We propose training the low-precision network with a full-precision auxiliary module. Specifically, during training we construct a mixed-precision network by augmenting the original low-precision network with the full-precision auxiliary module, and then jointly optimize the augmented mixed-precision network and the low-precision network. This strategy creates additional full-precision routes for updating the parameters of the low-precision model, making gradients back-propagate more easily. At inference time, we discard the auxiliary module, adding no computational overhead to the low-precision network. We evaluate the proposed method on image classification and object detection with various quantization approaches and show consistent performance gains. In particular, a 4-bit detector achieves near-lossless performance relative to the full-precision model, which is of great practical value.
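As a rough illustration of the idea in the abstract, the sketch below trains a toy linear model whose shared weights are quantized on the forward pass, while an auxiliary full-precision head provides an extra gradient route through a jointly optimized second branch. Gradients cross the quantizer via a straight-through estimator. This is a minimal NumPy sketch under assumed choices (uniform quantization, squared-error loss, a scalar toy task), not the authors' implementation.

```python
import numpy as np

def quantize(w, bits=4):
    # Uniform quantization to 2**bits - 1 steps in [-1, 1] (illustrative scheme).
    levels = 2 ** bits - 1
    return np.round(np.clip(w, -1.0, 1.0) * levels) / levels

rng = np.random.default_rng(0)
w = rng.normal(size=3)   # shared weights, quantized at forward time
v = rng.normal(size=3)   # full-precision auxiliary head (training only)
x = rng.normal(size=3)   # one toy input
target = 1.0
lr = 0.05

init_err = abs(quantize(w) @ x - target)

for _ in range(300):
    wq = quantize(w)
    y_low = wq @ x            # low-precision branch
    y_aux = wq @ x + v @ x    # mixed-precision branch through the auxiliary head
    # Joint squared-error loss over both branches.
    g_low = 2.0 * (y_low - target)
    g_aux = 2.0 * (y_aux - target)
    # Straight-through estimator: treat d(wq)/dw as 1, so gradients from BOTH
    # branches reach the shared weights w, giving an extra full-precision route.
    w -= lr * (g_low + g_aux) * x
    v -= lr * g_aux * x       # the auxiliary head only sees its own branch

# At inference the auxiliary head is discarded; only the quantized weights run.
y_infer = quantize(w) @ x
```

The straight-through estimator here stands in for whatever gradient approximation the underlying quantizer uses; the point of the sketch is only that the auxiliary branch contributes a second gradient term to the shared weights during training and vanishes at inference.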
Original language: English
Title of host publication: Proceedings - 33rd IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2020
Editors: Ce Liu, Greg Mori, Kate Saenko, Silvio Savarese
Place of Publication: Piscataway, NJ, USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 10
ISBN (Electronic): 9781728171685
ISBN (Print): 9781728171692
Publication status: Published - 2020
Externally published: Yes
Event: IEEE Conference on Computer Vision and Pattern Recognition 2020 - Virtual, China
Duration: 14 Jun 2020 - 19 Jun 2020


Conference: IEEE Conference on Computer Vision and Pattern Recognition 2020
Abbreviated title: CVPR 2020