Bayesian grouped horseshoe regression with application to additive models

Zemei Xu, Daniel F. Schmidt, Enes Makalic, Guoqi Qian, John L. Hopper

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


The Bayesian horseshoe estimator is known for its robustness when handling noisy and sparse big data problems. This paper presents two extensions of the regular Bayesian horseshoe: (i) the grouped Bayesian horseshoe and (ii) the hierarchical Bayesian grouped horseshoe. The advantage of the proposed methods is their flexibility in handling grouped variables through extra shrinkage parameters at the group and within-group levels. We apply the proposed methods to the important class of additive models, where group structures naturally exist, and we demonstrate that the hierarchical Bayesian grouped horseshoe has promising performance on both simulated and real data.
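The prior hierarchy described above can be sketched as follows. This is an illustrative construction only, not the authors' sampler: it assumes the standard horseshoe form in which each coefficient gets a normal prior whose scale is the product of a global, a group-level, and a within-group half-Cauchy shrinkage parameter; all names (`half_cauchy`, `sample_grouped_horseshoe_prior`) are made up for the sketch.

```python
import numpy as np

def half_cauchy(size, rng):
    # Sample |C(0, 1)| via the inverse-CDF (tangent) transform of a uniform.
    return np.abs(np.tan(np.pi * (rng.uniform(size=size) - 0.5)))

def sample_grouped_horseshoe_prior(group_sizes, rng):
    """Draw coefficients from a hierarchical grouped horseshoe prior:
    beta_{g,j} ~ N(0, tau^2 * delta_g^2 * lambda_{g,j}^2), with
    C+(0, 1) priors on tau (global), delta_g (group-level) and
    lambda_{g,j} (within-group) shrinkage parameters.
    Illustrative sketch of the hierarchy, not the paper's method."""
    tau = half_cauchy(1, rng)                   # global shrinkage
    betas = []
    for g_size in group_sizes:
        delta_g = half_cauchy(1, rng)           # group-level shrinkage
        lam = half_cauchy(g_size, rng)          # within-group (local) shrinkage
        betas.append(rng.normal(0.0, tau * delta_g * lam))
    return np.concatenate(betas)

rng = np.random.default_rng(0)
beta = sample_grouped_horseshoe_prior([3, 5, 2], rng)
print(beta.shape)  # → (10,)
```

Setting `delta_g` close to zero shrinks an entire group of coefficients at once, which is what makes the construction natural for additive models, where the basis-expansion coefficients of each covariate form a group.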

Original language: English
Title of host publication: AI 2016: Advances in Artificial Intelligence
Subtitle of host publication: 29th Australasian Joint Conference, Hobart, TAS, Australia, December 5–8, 2016, Proceedings
Editors: Byeong Ho Kang, Quan Bai
Place of Publication: Cham, Switzerland
Number of pages: 12
ISBN (Electronic): 9783319501277
ISBN (Print): 9783319501260
Publication status: Published - 2016
Externally published: Yes
Event: Australasian Joint Conference on Artificial Intelligence 2016 - Hobart, Australia
Duration: 5 Dec 2016 – 8 Dec 2016
Conference number: 29th

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: Australasian Joint Conference on Artificial Intelligence 2016
Abbreviated title: AI 2016


Keywords:

  • Additive models
  • Bayesian regression
  • Grouped variables
  • Horseshoe
