Abstract
Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation into the kernel feature space to deal with complex class boundaries, and thus yields promising performance for complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach to the Kernel Generalised Learning Vector Quantisation algorithm, which inherits the merits of KGLVQ and follows the maximal margin principle to improve generalisation capability. Experiments on the well-known data set III of BCI competition II show promising classification results for the proposed method.
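As context for the abstract, the sketch below shows the cost minimised by standard GLVQ (the relative distance difference of Sato and Yamada) and how prototype distances are evaluated once prototypes are expanded in the kernel feature space, which is the usual route taken by KGLVQ-style methods. The symbols (d⁺, d⁻, f, α, k) are generic and the maximal margin modification proposed in the paper is not reproduced here; this is a sketch of the standard formulation, not of the authors' exact objective.

```latex
% Standard GLVQ cost (Sato & Yamada); kernel variants replace squared
% Euclidean distances with feature-space distances via the kernel trick.
\begin{align}
  \mu(x) &= \frac{d^{+}(x) - d^{-}(x)}{d^{+}(x) + d^{-}(x)}, \qquad
  E = \sum_{i} f\bigl(\mu(x_i)\bigr)
  % d^{+}: squared distance to the closest prototype with the same label,
  % d^{-}: squared distance to the closest prototype with a different label,
  % f: a monotonically increasing squashing function (e.g. a sigmoid).
  \\
  % With prototypes expanded as w = \sum_j \alpha_j \phi(x_j), feature-space
  % distances need only kernel evaluations k(\cdot,\cdot):
  \lVert \phi(x) - w \rVert^{2} &= k(x,x)
    - 2\sum_{j} \alpha_{j}\, k(x, x_j)
    + \sum_{j,l} \alpha_{j}\alpha_{l}\, k(x_j, x_l).
\end{align}
```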
Original language | English |
---|---|
Title of host publication | Neural Information Processing - 19th International Conference, ICONIP 2012, Proceedings |
Pages | 191-198 |
Number of pages | 8 |
Edition | PART 3 |
DOIs | |
Publication status | Published - 2012 |
Externally published | Yes |
Event | International Conference on Neural Information Processing 2012 - Doha, Qatar. Duration: 12 Nov 2012 → 15 Nov 2012. Conference number: 19th. https://link.springer.com/book/10.1007/978-3-642-34500-5 (Proceedings) |
Publication series
Name | Lecture Notes in Computer Science |
---|---|
Publisher | Springer |
Number | PART 3 |
Volume | 7665 |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | International Conference on Neural Information Processing 2012 |
---|---|
Abbreviated title | ICONIP 2012 |
Country | Qatar |
City | Doha |
Period | 12/11/12 → 15/11/12 |
Internet address | https://link.springer.com/book/10.1007/978-3-642-34500-5 (Proceedings) |
Keywords
- Generalised Learning Vector Quantisation
- Kernel Method
- Learning Vector Quantisation
- Maximising Margin