Maximal margin learning vector quantisation

Trung Le, Dat Tran, Van Nguyen, Wanli Ma

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Kernel Generalised Learning Vector Quantisation (KGLVQ) extends Generalised Learning Vector Quantisation into the kernel feature space to handle complex class boundaries, and it has yielded promising performance on complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin variant of the KGLVQ algorithm, called MLVQ. MLVQ inherits the merits of KGLVQ while following the maximal margin principle to improve generalisation capability. Experiments on well-known data sets from the UCI repository show promising classification results for the proposed method.
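For context, the Generalised LVQ scheme that KGLVQ builds on updates, for each training sample, the nearest prototype of the same class and the nearest prototype of a different class using the gradient of a relative-distance cost. The sketch below is a minimal, hedged illustration of that GLVQ update in the input space only; the paper's KGLVQ and MLVQ variants would instead compute distances in a kernel feature space and add the maximal margin criterion, neither of which is reproduced here. All names (`glvq_step`, the learning rate `lr`) are illustrative, not from the paper.

```python
import numpy as np

def glvq_step(x, y, prototypes, labels, lr=0.05):
    """One GLVQ update step (input-space sketch, not the paper's kernelised method).

    Moves the nearest same-class prototype toward x and the nearest
    other-class prototype away, weighted by the gradient of the
    relative distance mu = (d+ - d-) / (d+ + d-).
    """
    d = np.sum((prototypes - x) ** 2, axis=1)      # squared Euclidean distances
    same = labels == y
    i_plus = np.where(same)[0][np.argmin(d[same])]     # nearest correct prototype
    i_minus = np.where(~same)[0][np.argmin(d[~same])]  # nearest wrong prototype
    d_plus, d_minus = d[i_plus], d[i_minus]
    denom = (d_plus + d_minus) ** 2
    # gradient factors of mu with respect to d+ and d-
    g_plus = d_minus / denom
    g_minus = d_plus / denom
    prototypes[i_plus] += lr * g_plus * (x - prototypes[i_plus])   # attract
    prototypes[i_minus] -= lr * g_minus * (x - prototypes[i_minus])  # repel
    return prototypes
```

In a kernelised version such as KGLVQ, the squared distances would be expressed through kernel evaluations rather than explicit coordinates, so prototypes live implicitly in the feature space.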

Original language: English
Title of host publication: 2013 International Joint Conference on Neural Networks, IJCNN 2013
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 6
ISBN (Print): 9781467361293
DOIs
Publication status: Published - 1 Dec 2013
Externally published: Yes
Event: IEEE International Joint Conference on Neural Networks 2013 - Dallas, United States of America
Duration: 4 Aug 2013 – 9 Aug 2013
https://ieeexplore.ieee.org/xpl/conhome/6691896/proceeding (Proceedings)

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: IEEE International Joint Conference on Neural Networks 2013
Abbreviated title: IJCNN 2013
Country/Territory: United States of America
City: Dallas
Period: 4/08/13 – 9/08/13
