A Bregman divergence view on the difference-of-convex algorithm

Oisín Faust, Hamza Fawzi, James Saunderson

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

1 Citation (Scopus)

Abstract

The difference-of-convex (DC) algorithm is a conceptually simple method for the minimization of (non)convex functions that are expressed as the difference of two convex functions. An attractive feature of the algorithm is that it maintains a global overestimator on the function and does not require a choice of step size at each iteration. By adopting a Bregman divergence point of view, we simplify and strengthen many existing non-asymptotic convergence guarantees for the DC algorithm. We further present several sufficient conditions that ensure a linear convergence rate, namely a new DC Polyak-Łojasiewicz condition, as well as a relative strong convexity assumption. Importantly, our conditions do not require smoothness of the objective function. We illustrate our results on a family of minimization problems involving the quantum relative entropy, with applications in quantum information theory.
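The abstract refers to the DC algorithm's iteration, which maintains a global convex overestimator of the objective by linearizing the subtracted convex part. As a point of reference, below is a minimal sketch of that classical DCA iteration in Python; the toy decomposition f = g − h, the function names, and the use of scipy's generic solver for the convex subproblem are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the classical DC algorithm (DCA), assuming a smooth
# convex component h with an available gradient. The decomposition and
# solver choice below are illustrative, not the paper's implementation.
import numpy as np
from scipy.optimize import minimize

def dca(g, grad_h, x0, n_iters=50):
    """Iterate x_{k+1} = argmin_x  g(x) - <grad_h(x_k), x>.

    Linearizing h at x_k makes g(x) - h(x_k) - <grad_h(x_k), x - x_k>
    a global overestimator of f = g - h, so the objective value is
    non-increasing along the iterates and no step size is needed.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        gk = grad_h(x)
        # Convex subproblem obtained by linearizing h at the current iterate.
        res = minimize(lambda y: g(y) - gk @ y, x)
        x = res.x
    return x

# Toy example: f(x) = sum(x_i^4) - 4*||x||_2^2 with the DC split
# g(x) = sum(x_i^4) (convex) and h(x) = 4*||x||_2^2 (convex).
g = lambda x: np.sum(x**4)
grad_h = lambda x: 8.0 * x
print(dca(g, grad_h, x0=[0.5, -1.0]))  # approaches a critical point with coordinates near ±sqrt(2)
```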

Original language: English
Title of host publication: Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
Editors: Francisco Ruiz, Jennifer Dy, Jan-Willem van de Meent
Place of publication: London, UK
Publisher: Proceedings of Machine Learning Research (PMLR)
Pages: 3427-3439
Number of pages: 13
Volume: 206
Publication status: Published - 2023
Event: International Conference on Artificial Intelligence and Statistics 2023 - Palau de Congressos, Valencia, Spain
Duration: 25 Apr 2023 - 27 Apr 2023
Conference number: 26th
https://proceedings.mlr.press/v206/ (Proceedings)
http://aistats.org/aistats2023/ (Website)

Conference

Conference: International Conference on Artificial Intelligence and Statistics 2023
Abbreviated title: AISTATS 2023
Country/Territory: Spain
City: Valencia
Period: 25/04/23 - 27/04/23