Robust lasso regression with student-t residuals

Daniel F. Schmidt, Enes Makalic

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

1 Citation (Scopus)


The lasso, introduced by Robert Tibshirani in 1996, has become one of the most popular techniques for estimating Gaussian linear regression models. An important reason for this popularity is that the lasso can simultaneously estimate all regression parameters as well as select important variables, yielding accurate regression models that are highly interpretable. This paper derives an efficient procedure for fitting robust linear regression models with the lasso in the case where the residuals are distributed according to a Student-t distribution. In contrast to Gaussian lasso regression, the proposed Student-t lasso regression procedure can be applied to data sets which contain large outlying observations. We demonstrate the utility of our Student-t lasso regression by analysing the Boston housing data set.
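The paper's exact EM derivation is not reproduced in this record, but the standard scale-mixture-of-normals representation of the Student-t distribution suggests the shape of such a procedure: the E-step computes a weight for each observation that shrinks toward zero for large residuals, and the M-step solves a weighted lasso problem. The sketch below is an illustrative assumption in that spirit, not the authors' implementation; the function names, the coordinate-descent lasso solver, and the fixed degrees of freedom `nu` are all choices made here for demonstration.

```python
# Hedged sketch of an EM-style Student-t lasso, using the
# scale-mixture-of-normals view of the t distribution.
# Not the paper's implementation; names and defaults are illustrative.
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used by the lasso coordinate update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_lasso_cd(X, y, w, lam, beta, n_iter=100):
    """Coordinate descent for the weighted lasso:
    minimise 0.5 * sum_i w_i (y_i - x_i' beta)^2 + lam * ||beta||_1."""
    r = y - X @ beta                      # current residuals
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            xj = X[:, j]
            denom = np.sum(w * xj**2)
            # partial residual correlation with coordinate j
            rho = np.sum(w * xj * (r + xj * beta[j]))
            b_new = soft_threshold(rho, lam) / denom
            r += xj * (beta[j] - b_new)   # keep residuals in sync
            beta[j] = b_new
    return beta

def student_t_lasso(X, y, lam=1.0, nu=4.0, n_em=50):
    """EM-style fit of a lasso regression with Student-t residuals
    (nu degrees of freedom, held fixed here for simplicity)."""
    n, p = X.shape
    beta = np.zeros(p)
    sigma2 = np.var(y)
    for _ in range(n_em):
        r = y - X @ beta
        # E-step: posterior mean of the latent precision scale under
        # the t model -- large residuals get small weights (robustness).
        w = (nu + 1.0) / (nu + r**2 / sigma2)
        # M-step: weighted lasso for beta, weighted variance for sigma2.
        beta = weighted_lasso_cd(X, y, w, lam, beta)
        r = y - X @ beta
        sigma2 = np.sum(w * r**2) / n
    return beta, sigma2
```

The downweighting step is what distinguishes this from Gaussian lasso regression: an outlying observation with a huge residual contributes almost nothing to the M-step, so it cannot drag the coefficient estimates.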

Original language: English
Title of host publication: AI 2017
Subtitle of host publication: Advances in Artificial Intelligence - 30th Australasian Joint Conference, Melbourne, VIC, Australia, August 19-20, 2017, Proceedings
Editors: Wei Peng, Damminda Alahakoon, Xiaodong Li
Place of Publication: Cham, Switzerland
Number of pages: 10
ISBN (Electronic): 9783319630045
ISBN (Print): 9783319630038
Publication status: Published - 2017
Externally published: Yes
Event: Australasian Joint Conference on Artificial Intelligence 2017 - Melbourne, Australia
Duration: 19 Aug 2017 - 20 Aug 2017
Conference number: 30th

Publication series

Name: Lecture Notes in Artificial Intelligence
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: Australasian Joint Conference on Artificial Intelligence 2017
Abbreviated title: AI 2017


Keywords

  • Expectation-maximisation algorithm
  • Lasso
  • Robust regression
