Least square Support Vector Machine for large-scale dataset

Khanh Nguyen, Trung Le, Vinh Lai, Duy Nguyen, Dat Tran, Wanli Ma

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

4 Citations (Scopus)


Support Vector Machine (SVM) is a well-known tool for classification and regression problems. Many applications require SVMs with non-linear kernels for accurate classification, but training an SVM with a non-linear kernel typically takes time quadratic in the size of the training dataset. In this paper, we start from a well-known variant of SVM, the Least Square Support Vector Machine, and apply the steepest sub-gradient descent method to propose the Steepest Sub-gradient Descent Least Square Support Vector Machine (SGDLSSVM). It is theoretically proven that the convergence rate of the proposed method to an ε-precision solution is O(log(1/ε)). Experiments on large-scale datasets indicate that the proposed method offers comparable classification accuracy while being faster than the baselines.
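To make the setting concrete, the following is a minimal sketch of a *linear* Least Square SVM trained by plain gradient descent on the primal objective 0.5·||w||² + 0.5·C·Σᵢ(1 − yᵢ(w·xᵢ + b))². This is only an illustration of the LS-SVM objective the paper builds on; the function name, hyperparameters, and the basic descent loop are assumptions here, not the paper's SGDLSSVM solver (which uses a steepest sub-gradient scheme with the O(log(1/ε)) rate stated above):

```python
import numpy as np

def lssvm_gd(X, y, C=1.0, lr=0.001, n_iters=1000):
    """Illustrative linear LS-SVM via gradient descent (not the paper's method).

    Primal objective with squared-error loss on the margin:
        J(w, b) = 0.5 * ||w||^2 + 0.5 * C * sum_i (1 - y_i * (w.x_i + b))^2
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iters):
        e = 1.0 - y * (X @ w + b)          # per-sample residuals
        grad_w = w - C * (X.T @ (y * e))   # dJ/dw
        grad_b = -C * np.sum(y * e)        # dJ/db
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
w, b = lssvm_gd(X, y)
preds = np.sign(X @ w + b)
```

Note that, unlike the hinge loss of the standard SVM, the LS-SVM loss is squared and differentiable everywhere, which is what makes simple (sub)gradient-style solvers attractive for large training sets.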

Original language: English
Title of host publication: 2015 International Joint Conference on Neural Networks (IJCNN 2015)
Editors: Haibo He, Asim Roy
Place of Publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 8
ISBN (Electronic): 9781479919604, 9781479919598
ISBN (Print): 9781479919611
Publication status: Published - 2015
Externally published: Yes
Event: IEEE International Joint Conference on Neural Networks 2015, Killarney, Ireland
Duration: 12 Jul 2015 – 17 Jul 2015
https://ieeexplore.ieee.org/xpl/conhome/7256526/proceeding (Proceedings)


Conference: IEEE International Joint Conference on Neural Networks 2015
Abbreviated title: IJCNN 2015


Keywords:
  • kernel method
  • solver
  • steepest gradient descent
  • Support Vector Machine
