Abstract
We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics: no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.
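For context, the two metrics contrasted in the abstract are conventionally defined as follows. This is a sketch in our own notation, not taken from the paper; in particular the choice of switching-cost norm and the exact form of the benchmarks are assumptions about the standard setup.

```latex
% Cost of an online algorithm ALG over rounds t = 1..T, with per-round
% convex objectives c_t and a switching cost between consecutive
% decisions x_{t-1} and x_t (norm choice is an assumption):
\mathrm{cost}(ALG) = \sum_{t=1}^{T} c_t(x_t) + \sum_{t=1}^{T} \| x_t - x_{t-1} \|

% Regret compares against the best single fixed decision in hindsight:
\mathrm{Regret}(T) = \mathrm{cost}(ALG) - \min_{x} \sum_{t=1}^{T} c_t(x)

% The competitive ratio compares against the optimal dynamic (offline)
% sequence of decisions, which may move every round:
\mathrm{CR} = \sup_{c_1, \dots, c_T} \frac{\mathrm{cost}(ALG)}{\mathrm{cost}(OPT)}
```

The incompatibility result in the abstract says, informally, that no algorithm can keep $\mathrm{Regret}(T)$ sublinear in $T$ while also keeping $\mathrm{CR}$ bounded by a constant, because the two benchmarks (a fixed point versus a dynamic sequence) pull the algorithm in opposite directions.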
Original language | English |
---|---|
Title of host publication | Conference on Learning Theory (COLT 2013) |
Editors | Shai Shalev-Shwartz, Ingo Steinwart |
Place of Publication | USA |
Publisher | Journal of Machine Learning Research (JMLR) |
Pages | 741-763 |
Number of pages | 23 |
Publication status | Published - 2013 |
Externally published | Yes |
Event | Annual Conference on Computational Learning Theory 2013, Princeton, United States of America. Duration: 1 Jan 2013 → … |
Conference
Conference | Annual Conference on Computational Learning Theory 2013 |
---|---|
Country/Territory | United States of America |
City | Princeton |
Period | 1/01/13 → … |