Abstract
Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation (GLVQ) into a kernel-induced feature space in order to handle complex class boundaries, and it has yielded promising performance on complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach (MLVQ) to the KGLVQ algorithm. MLVQ inherits the merits of KGLVQ while also following the maximal margin principle, improving generalisation capability. Experiments on well-known data sets from the UCI repository show promising classification results for the proposed method.
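The abstract does not spell out the update rules of KGLVQ or MLVQ, but both build on the standard GLVQ scheme (Sato & Yamada): each sample attracts its nearest same-class prototype and repels its nearest different-class prototype, weighted by a sigmoid of the relative distance difference. The sketch below is a minimal plain (non-kernel) GLVQ in NumPy for orientation only; the function names `glvq_fit` and `glvq_predict` are our own, not from the paper.

```python
import numpy as np

def glvq_fit(X, y, lr=0.1, epochs=30, seed=0):
    """Minimal GLVQ sketch: one prototype per class, squared Euclidean distance.
    This illustrates the base algorithm that KGLVQ/MLVQ extend; it is not the
    paper's kernelised or maximal-margin variant."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialise each prototype at its class mean plus a small perturbation.
    W = np.array([X[y == c].mean(axis=0) for c in classes])
    W += 0.01 * rng.standard_normal(W.shape)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            d = ((W - xi) ** 2).sum(axis=1)           # squared distances to prototypes
            same = classes == yi
            j_pos = np.where(same)[0][d[same].argmin()]    # nearest correct prototype
            j_neg = np.where(~same)[0][d[~same].argmin()]  # nearest wrong prototype
            dp, dn = d[j_pos], d[j_neg]
            mu = (dp - dn) / (dp + dn)                # relative distance difference
            f = 1.0 / (1.0 + np.exp(-mu))             # sigmoid cost
            g = f * (1.0 - f)                         # its derivative
            denom = (dp + dn) ** 2
            # Attract the correct prototype, repel the wrong one.
            W[j_pos] += lr * g * (dn / denom) * (xi - W[j_pos])
            W[j_neg] -= lr * g * (dp / denom) * (xi - W[j_neg])
    return classes, W

def glvq_predict(classes, W, X):
    """Assign each sample the label of its nearest prototype."""
    d = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]
```

KGLVQ replaces the Euclidean distance here with distances in a kernel feature space, and MLVQ additionally drives the prototype updates by a maximal margin criterion; the gradient structure above (attraction/repulsion scaled by the sigmoid derivative) is what both variants inherit.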
Original language | English |
---|---|
Title of host publication | 2013 International Joint Conference on Neural Networks, IJCNN 2013 |
Publisher | IEEE, Institute of Electrical and Electronics Engineers |
Number of pages | 6 |
ISBN (Print) | 9781467361293 |
DOIs | |
Publication status | Published - 1 Dec 2013 |
Externally published | Yes |
Event | IEEE International Joint Conference on Neural Networks 2013 - Dallas, United States of America Duration: 4 Aug 2013 → 9 Aug 2013 https://ieeexplore.ieee.org/xpl/conhome/6691896/proceeding (Proceedings) |
Publication series
Name | Proceedings of the International Joint Conference on Neural Networks |
---|---|
Conference
Conference | IEEE International Joint Conference on Neural Networks 2013 |
---|---|
Abbreviated title | IJCNN 2013 |
Country/Territory | United States of America |
City | Dallas |
Period | 4/08/13 → 9/08/13 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/6691896/proceeding |