Stable feature selection with support vector machines

Iman Kamkar, Sunil Kumar Gupta, Dinh Phung, Svetha Venkatesh

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

7 Citations (Scopus)


The support vector machine (SVM) is a popular method for classification, well known for finding the maximum-margin hyperplane. Combining the SVM with an ℓ1-norm penalty further enables it to perform feature selection and margin maximization simultaneously within a single framework. However, the ℓ1-norm SVM is unstable in selecting features in the presence of correlated features. We propose a new method to increase the stability of the ℓ1-norm SVM by encouraging similarities between feature weights based on feature correlations, which are captured via a feature covariance matrix. Our proposed method can capture both positive and negative correlations between features. We formulate the model as a convex optimization problem and propose a solution based on alternating minimization. Using both synthetic and real-world datasets, we show that our model achieves better stability and classification accuracy compared to several state-of-the-art regularized classification methods.
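The idea in the abstract can be sketched in code. The following is an illustrative toy implementation, not the authors' exact algorithm: it combines the hinge loss, an ℓ1 penalty, and a signed-correlation Laplacian term that pulls the weights of positively correlated features together (and of negatively correlated features toward opposite signs), optimized by plain subgradient descent rather than the paper's alternating minimization. The function name, hyperparameters, and the particular quadratic penalty are assumptions for illustration.

```python
import numpy as np

def stable_l1_svm(X, y, lam1=0.01, lam2=0.01, lr=0.05, n_iters=1000):
    """Toy sketch of an l1-norm SVM with a correlation-based stability term.

    Not the published algorithm: uses subgradient descent instead of
    alternating minimization, and a signed-correlation graph Laplacian
    as a stand-in for the paper's feature covariance penalty.
    """
    n, d = X.shape
    C = np.corrcoef(X, rowvar=False)          # feature correlation matrix
    S = C - np.diag(np.diag(C))               # signed off-diagonal correlations
    # Signed Laplacian: w^T L w = sum_ij |C_ij| (w_i - sign(C_ij) w_j)^2 / 2,
    # a positive semidefinite penalty linking correlated feature weights.
    L = np.diag(np.abs(S).sum(axis=1)) - S
    w, b = np.zeros(d), 0.0
    for _ in range(n_iters):
        margins = y * (X @ w + b)
        mask = margins < 1                    # samples violating the margin
        grad_w = -(X[mask].T @ y[mask]) / n   # hinge-loss subgradient
        grad_b = -y[mask].sum() / n
        grad_w += 2 * lam2 * (L @ w)          # correlation smoothness term
        grad_w += lam1 * np.sign(w)           # l1 subgradient
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

With correlated inputs, the Laplacian term tends to spread weight across a group of correlated features instead of arbitrarily picking one of them, which is the stability behavior the abstract describes.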

Original language: English
Title of host publication: AI 2015: Advances in Artificial Intelligence
Subtitle of host publication: 28th Australasian Joint Conference, Canberra, ACT, Australia, November 30 – December 4, 2015, Proceedings
Editors: Bernhard Pfahringer, Jochen Renz
Place of Publication: Cham, Switzerland
Number of pages: 11
ISBN (Electronic): 9783319263502
ISBN (Print): 9783319263496
Publication status: Published - 2015
Externally published: Yes
Event: Australasian Joint Conference on Artificial Intelligence 2015 - Canberra, Australia
Duration: 30 Nov 2015 – 4 Dec 2015
Conference number: 28

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: Australasian Joint Conference on Artificial Intelligence 2015
Abbreviated title: AI 2015
