Abstract
Support Vector Machine (SVM) is a well-known kernel-based method for binary classification problems. SVM aims to construct the optimal middle hyperplane, i.e., the one that induces the largest margin. In the linearly separable case, this middle hyperplane has been shown to offer high accuracy across datasets. However, real-world datasets often contain overlapping regions, and the decision hyperplane should therefore be adjusted according to the profile of the dataset. In this paper, we propose the Robust Support Vector Machine (RSVM), in which the hyperplane can be properly adjusted to accommodate real-world datasets. By setting the adjustment factor appropriately, RSVM can handle datasets with any possible profile. Our experiments on benchmark datasets demonstrate the superiority of RSVM for both binary and one-class classification problems.
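A minimal sketch of the underlying idea, assuming a standard soft-margin SVM: the learned hyperplane can be shifted along its normal by an adjustment factor (here called `delta`, a hypothetical name) before classifying, which moves the decision boundary toward one class in overlapping regions. This illustrates the general notion of adjusting the middle hyperplane; it is not the RSVM formulation from the paper.

```python
import numpy as np
from sklearn.svm import SVC

# Toy two-class data with overlapping regions (hypothetical example).
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2) - 1.0, rng.randn(100, 2) + 1.0])
y = np.hstack([-np.ones(100), np.ones(100)])

# Standard soft-margin SVM: its decision boundary sits in the
# middle of the margin, where the decision function f(x) = 0.
clf = SVC(kernel="linear", C=1.0).fit(X, y)

def predict_adjusted(clf, X, delta=0.0):
    """Classify with the hyperplane shifted along its normal.

    `delta` plays the role of an adjustment factor: delta = 0 recovers
    the ordinary SVM prediction, while positive/negative values move
    the boundary toward one class or the other.
    """
    scores = clf.decision_function(X)
    return np.where(scores - delta >= 0, 1, -1)

# delta = 0 reproduces clf.predict; a nonzero delta trades off errors
# between the two classes in the overlapping region.
print((predict_adjusted(clf, X, delta=0.0) == clf.predict(X)).all())
print((predict_adjusted(clf, X, delta=0.5) == y).mean())
```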
Original language | English |
---|---|
Title of host publication | Proceedings of the 2014 International Joint Conference on Neural Networks |
Editors | Cesare Alippi |
Place of Publication | Piscataway NJ USA |
Publisher | IEEE, Institute of Electrical and Electronics Engineers |
Pages | 4137-4144 |
Number of pages | 8 |
ISBN (Electronic) | 9781479914845, 9781479966271 |
ISBN (Print) | 9781479914821 |
DOIs | |
Publication status | Published - 2014 |
Externally published | Yes |
Event | IEEE International Joint Conference on Neural Networks 2014 - Beijing, China. Duration: 6 Jul 2014 → 11 Jul 2014. https://ieeexplore.ieee.org/xpl/conhome/6880678/proceeding (Proceedings)
Conference
Conference | IEEE International Joint Conference on Neural Networks 2014 |
---|---|
Abbreviated title | IJCNN 2014 |
Country/Territory | China |
City | Beijing |
Period | 6/07/14 → 11/07/14 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/6880678/proceeding
Keywords
- Kernel-based method
- One-class Support Vector Machine
- Support Vector Machine