Logistic regression with the nonnegative garrote

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

3 Citations (Scopus)

Abstract

Logistic regression is one of the most commonly applied statistical methods for binary classification problems. This paper considers the nonnegative garrote regularization penalty in logistic models and derives an optimization algorithm for minimizing the resulting penalized objective function. The search algorithm is computationally efficient and can be used even when the number of regressors is much larger than the number of samples. As the nonnegative garrote requires an initial estimate of the parameters, a number of possible initial estimators are compared and contrasted. Logistic regression with the nonnegative garrote is then compared with several popular regularization methods in a set of comprehensive numerical simulations. The proposed method attains excellent prediction and variable selection accuracy on both real and artificially generated data.
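
The abstract describes the method only at a high level; as a rough illustration, the sketch below fits a logistic model whose coefficients are the initial estimates rescaled by nonnegative shrinkage factors, penalised by the sum of those factors. The Lagrangian form of the garrote constraint, the ridge-regularised initial estimator, the fixed penalty weight `lam`, and the generic L-BFGS-B solver are all assumptions made for illustration; the paper's own search algorithm (designed for the case of many more regressors than samples) and its tuning procedure are not reproduced here.

```python
# Minimal sketch of logistic regression with a nonnegative garrote penalty.
# Assumptions (not from the paper): Lagrangian form of the garrote constraint,
# ridge-regularised logistic regression as the initial estimator, and a
# general-purpose bound-constrained solver instead of the paper's algorithm.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LogisticRegression

def garrote_logistic(X, y, lam=1.0):
    """Return garrote-shrunken coefficients and intercept for binary y in {0, 1}."""
    # Initial estimate of the parameters (the paper compares several choices).
    init = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
    beta0 = init.coef_.ravel()
    b0_init = init.intercept_[0]

    Xb = X * beta0  # each column rescaled by its initial coefficient estimate

    def objective(params):
        c, b0 = params[:-1], params[-1]
        eta = Xb @ c + b0
        # Numerically stable logistic negative log-likelihood
        nll = np.sum(np.logaddexp(0.0, eta) - y * eta)
        return nll + lam * np.sum(c)  # garrote penalty on the shrinkage factors

    p = X.shape[1]
    x0 = np.concatenate([np.ones(p), [b0_init]])  # start from c_j = 1
    bounds = [(0.0, None)] * p + [(None, None)]   # c_j >= 0, intercept free
    res = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
    c_hat, b0_hat = res.x[:-1], res.x[-1]
    return c_hat * beta0, b0_hat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    beta_true = np.array([2.0, -1.5, 1.0] + [0.0] * 7)
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
    coef, intercept = garrote_logistic(X, y, lam=2.0)
    print(np.round(coef, 2))  # factors for noise columns shrink towards zero
```

A larger `lam` drives more of the shrinkage factors exactly to zero, which is what gives the garrote its variable selection behaviour.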

Original language: English
Title of host publication: AI 2011
Subtitle of host publication: Advances in Artificial Intelligence - 24th Australasian Joint Conference, Proceedings
Publisher: Springer
Pages: 82-91
Number of pages: 10
ISBN (Print): 9783642258312
DOIs
Publication status: Published - 2011
Externally published: Yes
Event: Australasian Joint Conference on Artificial Intelligence 2011 - Perth, Australia
Duration: 5 Dec 2011 - 8 Dec 2011
Conference number: 24th
https://link.springer.com/book/10.1007/978-3-642-25832-9 (Proceedings)

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 7106
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Australasian Joint Conference on Artificial Intelligence 2011
Abbreviated title: AI 2011
Country/Territory: Australia
City: Perth
Period: 5/12/11 - 8/12/11
