Nonparametric budgeted stochastic gradient descent

Trung Le, Vu Nguyen, Tu Dinh Nguyen, Dinh Phung

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

20 Citations (Scopus)

Abstract

One of the most challenging problems in kernel online learning is to bound the model size. Budgeted kernel online learning addresses this issue by bounding the model size to a predefined budget. However, determining an appropriate value for such a predefined budget is arduous. In this paper, we propose the Nonparametric Budgeted Stochastic Gradient Descent, which allows the model size to grow automatically with the data in a principled way. We provide theoretical analysis showing that our framework is guaranteed to converge for a large collection of loss functions (e.g. Hinge, Logistic, L2, L1, and ε-insensitive), which enables the proposed algorithm to perform both classification and regression tasks without hurting the ideal convergence rate O(1/T) of standard Stochastic Gradient Descent. We validate our algorithm on real-world datasets to consolidate the theoretical claims.
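The abstract gives no pseudo-code, but the general setting it describes (kernel online learning where a kernel expansion is updated by stochastic gradient steps and the set of stored points, i.e. the model size, grows with the data) can be illustrated with a short sketch. The code below is only an assumed, generic kernelized SGD update for the hinge loss, not the authors' algorithm or their growth rule; the RBF kernel, the class name KernelSGD, and the step-size schedule eta_t = 1/(lambda*t) are choices made solely for this example.

# Illustrative sketch only: generic kernelized SGD for the hinge loss in which
# the support set grows with the data stream. NOT the authors' exact method.
import numpy as np

def rbf(x, z, gamma=1.0):
    # RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)
    return np.exp(-gamma * np.sum((x - z) ** 2))

class KernelSGD:
    def __init__(self, lam=0.1, gamma=1.0):
        self.lam, self.gamma = lam, gamma
        self.support = []   # stored inputs; their count is the "model size"
        self.alpha = []     # coefficients of the kernel expansion

    def decision(self, x):
        # f(x) = sum_i alpha_i * k(s_i, x)
        return sum(a * rbf(s, x, self.gamma)
                   for s, a in zip(self.support, self.alpha))

    def partial_fit(self, x, y, t):
        eta = 1.0 / (self.lam * t)                 # decaying step size (assumed schedule)
        self.alpha = [(1.0 - eta * self.lam) * a   # shrink existing coefficients
                      for a in self.alpha]
        if y * self.decision(x) < 1.0:             # hinge loss is active: store a new point
            self.support.append(np.asarray(x, dtype=float))
            self.alpha.append(eta * y)

# Usage on a toy stream with labels in {-1, +1}; the support set grows with the data.
rng = np.random.default_rng(0)
model = KernelSGD(lam=0.1, gamma=0.5)
for t in range(1, 201):
    x = rng.normal(size=2)
    y = 1.0 if x[0] + x[1] > 0 else -1.0
    model.partial_fit(x, y, t)
print("model size (number of stored points):", len(model.support))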
Original language: English
Title of host publication: Proceedings of Machine Learning Research
Subtitle of host publication: Artificial Intelligence and Statistics, 9-11 May 2016, Cadiz, Spain
Editors: Arthur Gretton, Christian C. Robert
Place of Publication: USA
Publisher: Proceedings of Machine Learning Research (PMLR)
Pages: 564-572
Number of pages: 9
Volume: 51
Publication status: Published - 2016
Externally published: Yes
Event: International Conference on Artificial Intelligence and Statistics 2016 - Cadiz, Spain
Duration: 9 May 2016 – 11 May 2016
Conference number: 19th
http://proceedings.mlr.press/v51/ (Proceedings)

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Print): 1938-7228

Conference

Conference: International Conference on Artificial Intelligence and Statistics 2016
Abbreviated title: AISTATS 2016
Country/Territory: Spain
City: Cadiz
Period: 9/05/16 – 11/05/16
Internet address: http://proceedings.mlr.press/v51/
