Dual space gradient descent for online learning

Trung Le, Tu Dinh Nguyen, Vu Nguyen, Dinh Phung

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

13 Citations (Scopus)

Abstract

One crucial goal in kernel online learning is to bound the model size. Common approaches employ budget maintenance procedures to restrict the model size using removal, projection, or merging strategies. Although projection and merging are known in the literature to be the most effective strategies, they demand extensive computation, whilst the removal strategy fails to retain the information of the removed vectors. An alternative way to address the model size problem is to apply random features to approximate the kernel function. This allows the model to be maintained directly in the random feature space, hence effectively resolving the curse of kernelization. However, this approach still suffers from a serious shortcoming, as it needs a high-dimensional random feature space to achieve a sufficiently accurate kernel approximation, which leads to a significant increase in the computational cost. To address all of these aforementioned challenges, we present in this paper the Dual Space Gradient Descent (DualSGD), a novel framework that utilizes random features as an auxiliary space to maintain information from data points removed during budget maintenance. Consequently, our approach permits the budget to be maintained in a simple, direct and elegant way while simultaneously mitigating the impact of the dimensionality issue on learning performance. We further provide a convergence analysis and conduct extensive experiments on five real-world datasets to demonstrate the predictive performance and scalability of our proposed method in comparison with state-of-the-art baselines.
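To make the dual-space idea concrete, below is a minimal illustrative sketch, not the authors' reference implementation: an online hinge-loss learner with an RBF kernel and a fixed support-vector budget, where the contribution of each evicted support vector is folded into a random Fourier feature (RFF) weight vector instead of being discarded. The class name DualSGDSketch, the partial_fit interface, and the oldest-first eviction rule are hypothetical simplifications introduced here for illustration.

import numpy as np

class DualSGDSketch:
    """Illustrative sketch of budgeted online kernel SGD with an auxiliary RFF space."""

    def __init__(self, dim, gamma=1.0, eta=0.1, budget=50, n_rff=100, seed=0):
        rng = np.random.RandomState(seed)
        self.gamma, self.eta, self.budget = gamma, eta, budget
        # Random Fourier features approximating the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)
        self.W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_rff, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_rff)
        self.n_rff = n_rff
        self.sv_x, self.sv_alpha = [], []   # budgeted support set (kernel space)
        self.w_rff = np.zeros(n_rff)        # auxiliary weights (random-feature space)

    def _phi(self, x):
        # RFF map: phi(x)^T phi(z) approximates k(x, z)
        return np.sqrt(2.0 / self.n_rff) * np.cos(self.W @ x + self.b)

    def _kernel(self, x, z):
        return np.exp(-self.gamma * np.sum((x - z) ** 2))

    def decision(self, x):
        # Prediction combines the exact kernel expansion and the auxiliary RFF part.
        f = sum(a * self._kernel(x, z) for a, z in zip(self.sv_alpha, self.sv_x))
        return f + self.w_rff @ self._phi(x)

    def partial_fit(self, x, y):
        # Hinge-loss stochastic gradient step: add a support vector on a margin violation.
        if y * self.decision(x) < 1.0:
            self.sv_x.append(x)
            self.sv_alpha.append(self.eta * y)
            # Budget maintenance: evict a support vector (oldest-first here, a simplification),
            # but retain its information by folding it into the random-feature weights.
            if len(self.sv_x) > self.budget:
                z, a = self.sv_x.pop(0), self.sv_alpha.pop(0)
                self.w_rff += a * self._phi(z)

# Example usage on synthetic binary labels in {-1, +1}:
rng = np.random.RandomState(1)
X = rng.randn(500, 5)
y = np.sign(X[:, 0] * X[:, 1] + 0.1 * rng.randn(500))
model = DualSGDSketch(dim=5)
for xi, yi in zip(X, y):
    model.partial_fit(xi, yi)

The key point the sketch captures is that budget maintenance stays a cheap removal operation, while the removed vector's contribution survives, approximately, in the low-dimensional random-feature weights; the dimensionality of that auxiliary space only needs to be large enough to summarize the evicted mass, not to approximate the whole kernel machine.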

Original language: English
Title of host publication: NIPS 2016 Proceedings
Subtitle of host publication: Advances in Neural Information Processing Systems (NIPS 2016)
Editors: D.D. Lee, M. Sugiyama, U.V. Luxburg, I. Guyon, R. Garnett
Place of publication: Maryland Heights MO USA
Publisher: Morgan Kaufmann Publishers
Number of pages: 9
Volume: 29
Publication status: Published - 2016
Externally published: Yes
Event: Advances in Neural Information Processing Systems 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016
Conference number: 29th
https://dl.acm.org/doi/proceedings/10.5555/3157096 (Proceedings)

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: Advances in Neural Information Processing Systems 2016
Abbreviated title: NIPS 2016
Country: Spain
City: Barcelona
Period: 5/12/16 - 10/12/16

Cite this

Le, T., Nguyen, T. D., Nguyen, V., & Phung, D. (2016). Dual space gradient descent for online learning. In D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, & R. Garnett (Eds.), NIPS 2016 Proceedings: Advances in Neural Information Processing Systems (NIPS 2016) (Vol. 29). (Advances in Neural Information Processing Systems). Morgan Kaufmann Publishers.