A polynomial expansion line search for large-scale unconstrained minimization of smooth L2-regularized loss functions, with implementation in Apache Spark

Michael Hynes, Hans De Sterck

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

In large-scale unconstrained optimization algorithms such as limited-memory BFGS (LBFGS), a common sub-problem is a line search that minimizes the loss function along a descent direction. Commonly used line searches iteratively find an approximate solution satisfying the Wolfe conditions, typically requiring multiple function and gradient evaluations per line search; in parallel settings this is expensive because each evaluation requires communication. In this paper we propose a new line search approach for cases where the loss function is analytic, as in least squares regression, logistic regression, or low-rank matrix factorization. We approximate the loss function along the search direction by a truncated Taylor polynomial, whose coefficients may be computed efficiently in parallel with less communication than a gradient evaluation, after which this polynomial may be minimized with high accuracy in a neighbourhood of the expansion point. The expansion may be repeated within a single line search invocation until the expansion point and minimum are sufficiently accurate. Our Polynomial Expansion Line Search (PELS) was implemented in the Apache Spark framework and used to accelerate the training of a logistic regression model with LBFGS and the Nonlinear Conjugate Gradient (NCG) method on binary classification datasets from the LIBSVM repository. In large-scale parallel numerical experiments on a 16-node cluster with 256 cores using the URL, KDD-A, and KDD-B datasets, the PELS approach produced significant convergence improvements over classical approximate Wolfe line searches. For example, to reach the final training label prediction accuracies, LBFGS with PELS achieved speedup factors of 1.8-2 over LBFGS with an approximate Wolfe line search, measured both in the number of iterations and in the time required, due to the more accurate step sizes computed in the line search. PELS has the potential to significantly accelerate widely used parallel large-scale regression and factorization computations, and is applicable to important classes of continuous optimization problems with smooth loss functions.
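The central computation can be sketched on a single machine. The Python snippet below is a minimal illustration, not the authors' Spark implementation: it builds a truncated Taylor polynomial of the restricted loss phi(alpha) = f(w + alpha*p) for L2-regularized logistic regression and minimizes it over a bracketing interval. The function name `pels_step`, the truncation degree, and the interval `[0, alpha_max]` are illustrative assumptions.

```python
import numpy as np

def pels_step(X, y, w, p, lam, degree=4, alpha_max=1.0):
    """Minimal single-machine sketch of a polynomial-expansion line search
    step for L2-regularized logistic regression (labels y in {-1, +1}).

    Builds a Taylor expansion of
        phi(alpha) = sum_i log(1 + exp(-y_i * x_i . (w + alpha*p)))
                     + (lam/2) * ||w + alpha*p||^2
    about alpha = 0 and minimizes it on [0, alpha_max]. The truncation
    degree and interval are illustrative, not taken from the paper.
    """
    m = y * (X @ w)                  # margins at the expansion point
    s = y * (X @ p)                  # directional rates of the margins
    sig = 1.0 / (1.0 + np.exp(-m))   # sigmoid(m)

    # Derivatives of g(t) = log(1 + exp(-t)) at the margins, via
    # g'(t) = sigma(t) - 1 and u = sigma*(1 - sigma):
    # g'' = u, g''' = u*(1 - 2*sigma), g'''' = u*(1 - 6*u).
    d1 = sig - 1.0
    d2 = sig * (1.0 - sig)
    d3 = d2 * (1.0 - 2.0 * sig)
    d4 = d2 * (1.0 - 6.0 * d2)
    derivs = [d1, d2, d3, d4][:degree]

    # Taylor coefficients c_k = phi^(k)(0) / k!; the constant term is
    # omitted since it does not affect the argmin. The data-dependent
    # sums here are what a parallel implementation would reduce in one
    # pass over the data.
    coeffs = np.zeros(degree + 1)
    fact = 1.0
    for k, dk in enumerate(derivs, start=1):
        fact *= k
        coeffs[k] = np.dot(dk, s ** k) / fact
    # The L2 regularizer is quadratic in alpha, so it only contributes
    # to the linear and quadratic coefficients.
    coeffs[1] += lam * np.dot(w, p)
    coeffs[2] += 0.5 * lam * np.dot(p, p)

    # Minimize the polynomial: compare its real critical points inside
    # the interval against the interval endpoints.
    poly = np.polynomial.Polynomial(coeffs)
    crit = poly.deriv().roots()
    crit = crit[np.isreal(crit)].real
    cand = np.concatenate(([0.0, alpha_max],
                           crit[(crit > 0.0) & (crit < alpha_max)]))
    return cand[np.argmin(poly(cand))]
```

In a distributed setting, only the per-example sums over g^(k)(m_i) * s_i^k need to be reduced across workers, which is the source of the communication savings relative to repeated function and gradient evaluations; the iterated re-expansion described in the abstract would correspond to repeating this step centered at w + alpha*p until the step converges.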

Original language: English
Title of host publication: Proceedings of the Sixteenth SIAM International Conference on Data Mining 2016
Editors: Sanjay Chawla, Wagner Meira Jr.
Place of Publication: Philadelphia, PA, USA
Publisher: Society for Industrial and Applied Mathematics (SIAM) Publications
Pages: 1-9
Number of pages: 9
ISBN (Electronic): 9781510828117
ISBN (Print): 9781611974348
DOI: https://doi.org/10.1137/1.9781611974348.68
Publication status: Published - 2016
Event: SIAM International Conference on Data Mining 2016, Hilton Miami Downtown, Miami, United States of America
Duration: 5 May 2016 - 7 May 2016
Conference number: 16th
http://www.siam.org/meetings/sdm16/

Conference

Conference: SIAM International Conference on Data Mining 2016
Abbreviated title: SDM 2016
Country: United States of America
City: Miami
Period: 5/05/16 - 7/05/16
Internet address: http://www.siam.org/meetings/sdm16/

Cite this

Hynes, M., & De Sterck, H. (2016). A polynomial expansion line search for large-scale unconstrained minimization of smooth L2-regularized loss functions, with implementation in Apache Spark. In S. Chawla, & W. Meira Jr. (Eds.), Proceedings of the Sixteenth SIAM International Conference on Data Mining 2016 (pp. 1-9). Philadelphia, PA, USA: Society for Industrial and Applied Mathematics (SIAM) Publications. https://doi.org/10.1137/1.9781611974348.68