Original Articles

Least squares twin support vector machine classification via maximum one-class within class variance

Pages 53-69 | Received 20 Aug 2009, Accepted 23 Jul 2010, Published online: 18 Aug 2010
 

Abstract

A twin support vector machine (TWSVM), an effective classification tool, seeks two non-parallel planes obtained by solving two quadratic programming problems (QPPs); solving these QPPs incurs a high computational cost. The least squares twin SVM (LSTSVM), a variant of TWSVM, avoids this deficiency and obtains the two non-parallel planes directly by solving two systems of linear equations. Both TWSVM and LSTSVM operate directly on the patterns through two constrained optimization problems, whose constraints require each plane to be close to the patterns of its own class and distant from the patterns of the other class. However, these constraints weaken the geometric interpretation of the generalized proximal SVM (GEPSVM), so that on many exclusive-or (XOR) examples with different distributions they may yield worse classification performance. Moreover, they fail to capture the local geometry of the samples and are sensitive to outliers. In this paper, inspired by several geometrically motivated learning algorithms and by the advantages of LSTSVM, we first propose a new classifier, called LSTSVM classification via maximum one-class within-class variance (MWSVM), designed to avoid the aforementioned deficiencies while retaining the advantages of LSTSVM. The new method incorporates the one-class within-class variance directly into the classifier, so that the genuine geometric interpretation of GEPSVM is expected to be preserved within LSTSVM. Nevertheless, like LSTSVM, MWSVM may still perform poorly in many cases, especially when outliers are present. Therefore, a localized version of MWSVM (LMWSVM) is further proposed to remove outliers effectively. Another advantage of LMWSVM is that it retains only mutually nearest points as the training set, so the MWSVM classifier is determined by fewer training samples than LSTSVM requires. This naturally reduces the storage cost of LSTSVM, especially in the nonlinear case. Experiments carried out on both toy and real-world problems demonstrate the effectiveness of both MWSVM and LMWSVM.
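For a concrete starting point, below is a minimal NumPy sketch of the standard linear LSTSVM that the abstract builds on: each non-parallel plane is obtained by solving one regularized linear system rather than a QPP, and a test point is assigned to the class whose plane is nearer. The function names and the trade-off parameters c1 and c2 are illustrative assumptions following the commonly cited LSTSVM formulation; the paper's MWSVM and LMWSVM classifiers modify these systems by incorporating the one-class within-class variance and a localization step, which is not reproduced here.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0):
    """Fit the two non-parallel planes of linear LSTSVM.

    A : (m1, n) patterns of class +1
    B : (m2, n) patterns of class -1
    Returns ((w1, b1), (w2, b2)), one plane per class.
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])   # augmented class +1 matrix [A e]
    F = np.hstack([B, e2])   # augmented class -1 matrix [B e]

    # Plane 1 (close to class +1, far from class -1):
    # [w1; b1] = -(F^T F + (1/c1) E^T E)^{-1} F^T e2
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * (E.T @ E), F.T @ e2).ravel()
    # Plane 2 (close to class -1, far from class +1):
    # [w2; b2] =  (E^T E + (1/c2) F^T F)^{-1} E^T e1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * (F.T @ F), E.T @ e1).ravel()

    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def lstsvm_predict(X, planes):
    """Assign each row of X to the class whose plane is nearer
    (perpendicular distance)."""
    (w1, b1), (w2, b2) = planes
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

if __name__ == "__main__":
    # Toy two-Gaussian example (synthetic data, for illustration only).
    rng = np.random.default_rng(0)
    A = rng.normal(loc=[2.0, 2.0], size=(40, 2))    # class +1 samples
    B = rng.normal(loc=[-2.0, -2.0], size=(40, 2))  # class -1 samples
    planes = lstsvm_fit(A, B)
    print(lstsvm_predict(np.array([[2.0, 2.0], [-2.0, -2.0]]), planes))
```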

Acknowledgements

This work was supported by the Scientific and Technological Innovation Foundation of Jiangsu Province, China (No. BK2009393), the Scientific and Technological Innovation Foundation of Nanjing Forestry University, China, and the National Science Foundation of China (No. 90820306).
