A tale of two metrics: Simultaneous bounds on competitiveness and regret

Lachlan Leicester Henry Andrew, Siddharth Barman, Katrina Ligett, Minghong Lin, Adam Meyerson, Alan Roytman, Adam Wierman

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

19 Citations (Scopus)

Abstract

We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics – no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.
Original language: English
Title of host publication: Conference on Learning Theory (COLT 2013)
Editors: Shai Shalev-Shwartz, Ingo Steinwart
Place of Publication: USA
Publisher: Journal of Machine Learning Research (JMLR)
Pages: 741–763
Number of pages: 23
Volume: 30
ISBN (Electronic): 1533-7928
Publication status: Published - 2013
Externally published: Yes
Event: Annual Conference on Computational Learning Theory - Princeton, United States of America
Duration: 1 Jan 2013 → …

