Group selection and shrinkage: structured sparsity for semiparametric additive models

Ryan Thompson, Farshid Vahid

Research output: Contribution to journal › Article › peer-review

Abstract

Sparse regression and classification estimators that respect group structures apply to an assortment of statistical and machine learning problems, from multitask learning to sparse additive modeling to hierarchical selection. This work introduces structured sparse estimators that combine group subset selection with shrinkage. To accommodate sophisticated structures, our estimators allow for arbitrary overlap between groups. We develop an optimization framework for fitting the nonconvex regularization surface and present finite-sample error bounds for estimation of the regression function. As an application requiring structure, we study sparse semiparametric additive modeling, a procedure that allows the effect of each predictor to be zero, linear, or nonlinear. For this task, the new estimators improve on alternatives across several metrics on synthetic data. Finally, we demonstrate their efficacy in modeling supermarket foot traffic and economic recessions using many predictors. These demonstrations suggest that sparse semiparametric additive models, fit using the new estimators, are an excellent compromise between fully linear and fully nonparametric alternatives. All of our algorithms are made available in the scalable implementation grpsel. Supplementary materials for this article are available online.
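For readers who want to try the estimators, the sketch below shows how a group-sparse fit might look using the grpsel R package mentioned in the abstract. The simulated data, the group encoding, and the specific calls are illustrative assumptions based on the package's documented interface, not code taken from the article or its supplement.

```r
# A minimal sketch, assuming grpsel's documented interface:
# grpsel(x, y, group) fits group subset selection estimators.
library(grpsel)

set.seed(2024)
n <- 100; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- x[, 1] - x[, 6] + rnorm(n)  # signal lives in groups 1 and 2

# Vector form: predictor j belongs to group group[j]; per the package
# documentation, a list of index vectors can instead be supplied,
# which permits overlapping groups.
group <- rep(1:2, each = 5)

fit <- grpsel(x, y, group)  # fits a path over the regularization parameters
coef(fit)                   # coefficients along the fitted path
```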

Original language: English
Pages (from-to): 1286-1297
Number of pages: 12
Journal: Journal of Computational and Graphical Statistics
Volume: 33
Issue number: 4
DOIs:
Publication status: Published - 2024

Keywords

  • Group lasso
  • Group sparsity
  • Group subset selection
  • Structured sparsity
  • Variable selection
